Escalating procurement costs and setbacks in the development of new weapons systems can be blamed partly on the Defense Department’s failure to exploit modeling and simulation technologies, asserts the Pentagon’s former testing director.
“I’ve always been puzzled by the fact that modeling and simulation is so problematic in the Defense Department,” says Philip E. Coyle.
A former director of operational test and evaluation at the Defense Department, Coyle contends that the Pentagon, unlike other government organizations, has spent millions of dollars on simulations but has not taken advantage of the technology as a useful tool to test and assess weapons systems before they are produced.
The Defense Department and the services “regularly make high-sounding pronouncements that modeling and simulation is going to be the answer and the greatest thing since sliced bread … but it is not easy to find examples in the Defense Department where M&S has really made a difference,” Coyle says in a February speech to the National Defense Industrial Association test and evaluation conference.
By comparison, organizations such as the Lawrence Livermore National Laboratory have proved that modeling, simulation and testing can make a “very happy marriage,” says Coyle. At the lab, it is “literally unthinkable that you would spend millions of dollars on a test without making an equivalent effort first in M&S.”
One issue is a cultural bias at the Defense Department that views computer models as vehicles to justify programs, rather than as tools to better understand the technology. “The focus in defense acquisition is on buying something and moving on, not on understanding for its own sake … Detailed scientific and technical understanding is not the first priority,” says Coyle. “By contrast, the culture in the development of nuclear weapons has been to achieve first-principles understanding of everything … Without those models, the Department of Energy weapons labs would be quite helpless today.”
Another reason why simulations often are shunned by defense program managers is that they don’t want to risk delaying production schedules when technical glitches pop up in computer models, Coyle says. “The incentives are to get the system into production with as little perturbation as possible. In the Department of Energy, it is assumed that you will spend money on M&S, lots of money, and it is budgeted every year … There is no expectation that you can trade off M&S for testing; they go together. So there is no PM who is trying to cut testing by spending money on M&S.”
The goal for modeling and simulation in the Department of Energy is to be able to “predict with rather astonishing accuracy what will happen,” he says. “This means that modeling and simulation, and the evaluations that come from those models, may produce bad news.” At the Defense Department, “the tendency is to expect that test and evaluation will produce bad news and that M&S will produce good news. Thus M&S is often recommended as the better choice.”
This good news, however, may not have anything to do with reality, asserts Coyle, “especially if the models being used were first developed — as is often the case — to sell the system, not to understand it in technical detail.”
In the DOE labs, if a model produced bad news, “you wouldn’t just change the model so it produced happier results, and then stop. This is something I’ve seen done in the Department of Defense.”
These issues are in many ways symptomatic of a trend toward less stringent testing standards, Coyle suggests. The Government Accountability Office and the Defense Science Board, among others, have argued for improved oversight of weapon programs, earlier and more rigorous testing, and greater reliance on computer models to identify technology risks.
One problem often encountered with models is that contractor-developed simulations are designed to sell a system to Congress and to senior Defense Department leadership, says Coyle. “They are treated as proprietary under many contracts even though those models are then used, sometimes misused, in the development of a system.”