Procurement Reforms Reignite Feud Between Weapon Buyers and Testers
By Sandra I. Erwin

Pentagon procurement chief Frank Kendall is proposing changes in how weapon systems are tested. He suggests tests should be performed earlier in the design cycle than is customarily done.

The sooner the testing, he says, the sooner the Pentagon will catch problems before the military sinks huge amounts of money into a program. This would help avert expensive redesigns and modifications — a costly lesson the Pentagon learned over the past decade from the F-35 fighter program.

Kendall also believes that earlier "developmental" testing can help reduce the cost of "operational" testing — realistic live-fire drills that are mandated by law before any weapon system goes into production.

"We are trying to have more efficient test programs overall, get the data out before we make production decisions. That's critical to design stability," Kendall told National Defense after delivering a speech at a recent conference on acquisition reform. Program managers should have more data about how their systems perform before they begin operational tests, Kendall said. "We will continue to try to blend operational testing and developmental testing."

But the Pentagon's plan to wring out more "efficiency" from testing has stirred old animosities between the procurement shop and the office of the director of operational test and evaluation — which operates independently from the procurement office and reports directly to the secretary of defense. DOT&E, as the testing office is known, has been a thorn in the side of many big-ticket weapon programs. Kendall's comments are raising fears in the testing community that their budgets will be gutted.

Acquisition executives from every branch of the military have chafed at being blindsided by DOT&E reports. For many years, program managers have sought to have more control over test reports before DOT&E releases them to Congress and the news media. Procurement officials would rather have test results reported directly to them and have greater say in what information is disclosed.

After operational testers gave the Navy's littoral combat ship a scathing review in their fiscal year 2012 annual report to Congress, service officials were unprepared for the political damage the report would cause. Testers concluded the ship lacked firepower and was not survivable in a high-intensity conflict. The report prompted an extensive review of the LCS, and the Pentagon ordered the Navy to consider alternatives.

Kendall's deputy, Darlene Costello, in a speech to a test-and-evaluation industry conference last month, explained the rationale for planned changes related to weapon tests. There is now a "big emphasis on what we do before an RFP goes out. ... Testing is a big part of that," said Costello, who is director of program and acquisition management. 

An RFP, or request for proposals, details specific requirements that contractors must satisfy. Costello said earlier testing would help military buyers understand the state of technology and set more realistic expectations before they commit to a weapons contract. This is a "big area associated with testing that Secretary Kendall wants to emphasize," she said. "I have sincere appreciation for the relationship between testing and acquisition." She also noted that in these times of declining budgets, tension between the two communities tends to rise as everyone competes for a slice of a shrinking pie.

Speaking at the same conference, Director of Operational Test and Evaluation J. Michael Gilmore pushed back on the notion that testing costs should be treated as expendable overhead. "How are you going to compress testing in this era of constrained budgets? I think it's a mistake. It accepts the premise that testing is driving increased cost," he said. "The facts don't support that premise. We want to make sure we do testing as rigorously and as often as we can."

Infighting between program officers and testers is par for the course at the Defense Department. Kendall's predecessor Ashton Carter commissioned an independent team in 2011 to probe complaints that developmental and operational testing contribute to excessive cost and schedule slippages in programs.

A review of 40 programs that had experienced significant delays found that tests themselves were to blame in only seven. It concluded that "testing and test requirements, by themselves, do not generally cause major program delays." It found no significant evidence that the testing community is responsible for unplanned requirements, cost or schedule problems. In the other 37 programs, the delays stemmed not from the tests but from problems that testing uncovered.

In a June 2011 letter to senior officials, Carter called for a better "relationship and interaction among the testing, requirements and program management communities. ... Although the independent review found that tensions between programs and the test community are to be expected and healthy for the most part, occasionally the tensions grow to the point of frustration and animosity."

In his recent speech at the industry conference, Gilmore made a case that testers tend to become convenient scapegoats but are hardly to blame for cost and schedule overruns.

At the root of the problems that have plagued major Pentagon programs is the way the military services define their requirements, said Gilmore. "Oftentimes requirements are defined in technical specifications. That's OK, but insufficient to ensure a system provides military utility."

He cited the Navy's P-8 maritime surveillance aircraft as a case in point. In operational tests last year, the aircraft showed it could fly, but it was not able to perform key missions like wide-area antisubmarine surveillance. Gilmore blamed the flap on the Navy because it had not specified antisubmarine warfare as a "key performance parameter." In an operational test, said Gilmore, the aircraft needs to do much more than just fly.

Poorly written requirements continue to haunt programs, he said. "In this wonderful town, common sense doesn't play a role." Gilmore said many of the key parameters for the F-35 joint strike fighter relate to aircraft performance and payload capacity. "If we were just going to test KPPs, we would not fly combat missions, we would not penetrate air defenses, we would just fly off the carrier and back. ... How meaningful are these requirements?"

Many of the more complex mission systems have yet to be tested in the F-35, even though the program is already in production. "The Defense Department is committed to the program," he said. "The department has no choice but to make the program work. I don't see that the program is going to be derailed by any negative test results," he said. "My guess is that if there are problems in testing that require hardware changes, those will come in later blocks."

Gilmore rejects the argument often made by procurement officials that a system can be "fixed over time."

The Army, he said, wasted billions of dollars on a future combat system and on digital radios that never materialized. Its leaders were guilty of "approving requirements that are not achievable." Some programs get to operational testing and still don't have concepts for how they will operate, he said. "If the testing community played a more prominent role in requirements — and that's a big if — perhaps we could have avoided these mistakes," Gilmore said.

Laying the blame for added cost and delays on testers is a common practice, he said. But programs should not skimp on testing, Gilmore warned. "In this budget environment, testing, especially operational testing, assuming it's feasible, is essential. When people claim that testers add billions of dollars to programs, the facts don't support that claim. ... The earlier you test, the sooner you fix programs, the lower the cost."

Gilmore suggested major programs should have a firm "test and evaluation master plan" before an RFP is written. "I have never understood when I am told we cannot do a T&E master plan until we get a response back from contractors. How can you generate a meaningful RFP without a draft test plan? Just as importantly, how can you evaluate the responses industry provides? I don't get it. What I fear is that some of these RFP evaluations are check mark exercises. I hope that's not the case."

It is still not clear how Kendall’s plan to accelerate tests will be put into practice. Operational tests usually happen late in the development cycle — sometimes just months before the Pentagon must make a decision to begin production. It can take years to have prototypes available, so unless prototypes are built faster, experts contend, there's only so much operationally realistic testing that can be done earlier in a program.

Of greater concern to testers is how program managers will deal with the current budget crunch. When funds are slashed, program managers historically have pared back testing, especially developmental testing. Gilmore and others have argued that, as budgets come down, the Pentagon needs more testing because it cannot afford to make costly mistakes.

Credit: Frank Kendall speaks at a roll-out ceremony for the first two F-35s for the Royal Australian Air Force (Defense Dept. photo)


Re: Procurement Reforms Reignite Feud Between Weapon Buyers and Testers

Kendall and Gilmore are right on the mark. T&E should start early and continue through the entire acquisition process to ensure that design issues are discovered and corrected early, before production begins. PMs and testers are partners in producing the best products for our fighting forces, and "frustration and animosity" has no place in the acquisition process. Maybe we should reevaluate the metrics and motivations which define "program success."
Tom at 8/14/2014 2:45 PM



