
Defense Needs Better Ways to Test Software 


By Bernie Gauf 

Developing software for the Defense Department has many inherent challenges, not the least of which is testing. Traditional software testing for defense systems consumes up to 50 percent of development resources. Yet, it is only during this phase that engineers can be assured systems are ready for deployment. Because of the critical nature of military systems, corners cannot be cut.

In brief, the traditional approach to software testing involves manually creating and running a wide range of tests at all stages of development to ensure that the system requirements have been successfully incorporated. Additionally, tests must be run to confirm that new software works properly with the software and systems already in place.

With the current manual testing approach, tests are typically documented using a word processor or spreadsheet application, with a step-by-step procedure describing operator actions or input and expected response. Test procedures also describe how the system is required to be configured prior to conducting the test. The pass/fail status of each step is usually written down by the test engineer on a printed hard copy of the test procedure. Every time a test is run, the test engineer executes each step of the test procedure and records the results. It is a labor-intensive process.
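The step-by-step procedure described above maps naturally onto a simple data structure. As an illustrative sketch only (the class and field names are hypothetical, not from the article), a digitized test procedure might look like this:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TestStep:
    action: str                    # operator action or input
    expected: str                  # expected system response
    passed: Optional[bool] = None  # recorded when the step is executed

@dataclass
class TestProcedure:
    name: str
    setup: str                     # required system configuration before the test
    steps: list = field(default_factory=list)

    def result(self) -> str:
        # A single failed step fails the whole procedure.
        if any(s.passed is False for s in self.steps):
            return "FAIL"
        if self.steps and all(s.passed for s in self.steps):
            return "PASS"
        return "INCOMPLETE"
```

Even this small sketch shows why the manual version is labor-intensive: every field the engineer writes on the printed hard copy must be filled in again each time the test is rerun.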

Innovations in software testing technology provide alternatives to current testing methods. A case in point is automated software testing.

As with manual processes, automated software testing requires engineers to design tests that support the verification of requirements. Automated testing is also similar to manual testing in that the test program is dependent on the design of high-quality tests.

Automated software testing enables the operator actions or input, along with the expected response, to be digitally captured or recorded. When the test engineer executes an automated test, the technology sends the captured operator actions and input to the system under test and evaluates the response against the expected results. A report is automatically generated to document the outcome. To run the test, the engineer simply launches it; there is no need to conduct each step manually.
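The replay-compare-report loop just described can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; `run_automated_test` and the stub system are hypothetical names:

```python
def run_automated_test(procedure, system):
    """Replay captured operator inputs against the system under test,
    compare each actual response with the expected one, and build a report."""
    report = []
    for action, expected in procedure:
        actual = system(action)  # send the captured input to the system under test
        report.append({
            "action": action,
            "expected": expected,
            "actual": actual,
            "status": "PASS" if actual == expected else "FAIL",
        })
    return report

# Stand-in for a real system under test, used here only for illustration.
def stub_system(action):
    responses = {"power on": "ready", "self-test": "all checks passed"}
    return responses.get(action, "no response")
```

Once the actions and expected responses are captured, rerunning the test is a single function call, which is the source of the labor savings the article describes.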

To transform testing, this new technology must be embraced.

Change is slow. Sometimes the best ideas take years to catch on. Engineers engaged in software development may believe, and rightly so, that they are part of a highly progressive industry. But even there one finds resistance to change.

Progress always involves risk. That is nothing new. In the early 1900s, Henry Ford introduced the assembly line to automobile production. It was novel. It was threatening. The unions didn’t like it and were worried it would lead to job loss. But this early form of automation boosted production and cut assembly time in half, resulting in improved efficiency, better quality and increased affordability. Within five years, Ford’s process revolutionized the automobile manufacturing industry.

Fifty years later, George Devol designed the first programmable robotic arm. In the early 1960s, this device transported die castings in a General Motors plant in New Jersey. Initially seen as a curiosity, robots also prompted speculation. Would they eventually replace the common worker? It was too early to tell; regardless, robotics soon became another important advance for the manufacturing sector.

As a nation, we say that progress in science, technology, engineering and math (STEM) is crucial to America’s future. This certainly has been true in the past. Manufacturing was forever altered by the introduction of automation and robotics. However, if we truly believe this, then we need to embrace the outcome of advances in STEM going forward, and apply them to the development of our military’s systems. 

Engineering teams could benefit from embracing new technology — specifically the advances now available in software testing.

Program managers and engineers involved in large, highly complex software projects for the Defense Department devote their resources to planning, designing, developing, and testing. Many highly skilled engineers are required to conduct the testing and to analyze the results. Following the traditional industry approach, software testing consumes, on average, more than half of a project’s schedule and resources.

In the current budget environment, the pressure is on to streamline the software development process and reduce headcount. As a result, fewer engineers will be available to conduct tests. However, the need to verify system requirements and performance has not changed.

Going forward, limited resources will mandate that systems have longer lifespans, increasing the need for more regression tests over time. In addition, the pace of software updates will likely increase, further augmenting the regression testing workload.

Automated software testing technology accelerates execution and reporting time while expanding test coverage. For today’s large, complex, mission-critical systems, this translates to a significant increase in testing efficiency.

The majority of tests that are run manually can be automated, with an expected increase in testing productivity of about 75 percent. Automation can implement a broad range of tests and easily repeat them, covering multiple combinations and increasing the amount of testing completed. Generally, a high percentage of the tests that are part of a manual testing program — functional, performance, concurrency and stress — can be automated.

Automated tests readily produce documented, objective, quality evidence, including requirements traceability and comprehensive pass/fail results. They provide the capability to verify thousands to millions of test permutations in minutes to hours.
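The claim about verifying large numbers of permutations rests on a simple mechanism: once a check is automated, it can be driven across every combination of its input parameters. A generic sketch of that idea (the function name and check are hypothetical):

```python
import itertools

def run_permutations(check, *parameter_ranges):
    """Exercise `check` over every combination of the parameter ranges,
    returning the total number of combinations run and the failing ones."""
    failures = []
    total = 0
    for combo in itertools.product(*parameter_ranges):
        total += 1
        if not check(*combo):
            failures.append(combo)
    return total, failures
```

With three parameters of 100 values each, this single loop already covers a million permutations, and the failing combinations come back as objective, documented evidence rather than handwritten notes.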

Implementing automated testing does involve an initial commitment to design. But instead of defining and executing every test step and command manually, over and over, an automated testing solution requires that effort only once. Once complete, a well-designed automated testing program can support the verification of multiple baselines and configurations.
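The design-once, verify-many-baselines point can be made concrete with a small sketch, assuming tests are written as functions of a configuration (the names here are illustrative, not from the article):

```python
def run_suite(tests, configurations):
    """Run one set of automated tests against several system baselines.
    The test definitions are written once; only the configuration varies."""
    return {
        name: all(test(cfg) for test in tests)
        for name, cfg in configurations.items()
    }
```

The same test suite is applied unchanged to each baseline, so supporting a new configuration means adding a configuration entry, not rewriting the tests.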

Just as early manufacturing was transformed by innovators willing to forge a new and different path to productivity, so too will those who incorporate automated testing into software development lead us squarely into the future. We need to accelerate the adoption of innovation. Military leaders, program managers and engineers who are still using traditional methods of software testing should embrace this progressive technology.

Bernie Gauf is CEO of Innovative Defense Technologies, an information technology business headquartered in Arlington, Va., and co-author of the book Implementing Automated Software Testing (Addison-Wesley, 2009).

Reader Comments

Re: Defense Needs Better Ways to Test Software

Prof. Offutt, I completely agree with your assessment: Automated testing, or any type of testing, is not an activity for unskilled labor.

Elfriede Dustin on 09/28/2013 at 15:30

Re: Defense Needs Better Ways to Test Software

Many companies I've worked with view testing as an activity for unskilled labor--people with no math or programming skills. I've noted this especially in government contractors. And yes, you need math and programming skills to automate your tests.

Jeff Offutt on 09/24/2013 at 13:00

Re: Defense Needs Better Ways to Test Software

Ken, that is a good question. Why are people so slow to adopt test automation? I would be curious to hear from other readers on this. Automated testing has been the focus of my work for many years. It seems that spreading the word about it is half the battle!

Elfriede Dustin, Solutions Director, Innovative Defense Technologies

Elfriede Dustin on 09/20/2013 at 17:15

Re: Defense Needs Better Ways to Test Software

Automated test software is exactly what we used on our portion of the Boeing 777. Software CM controlled the whole process. Code, documents, tests: all checked out, compiled, tested, and documented back into CM without user interface. Great results; of course, that was 15 years ago now. I wonder why others don't do it?

Ken Pinard on 09/14/2013 at 17:44


