U.S. Department of Defense
Office of the Assistant Secretary of Defense (Public Affairs)
Speech
On the Web: http://www.defense.gov/Speeches/Speech.aspx?SpeechID=985
Media contact: +1 (703) 697-5131/697-5132
Public contact: http://www.defense.gov/landing/comment.aspx or +1 (703) 571-3343

Reinventing DoD Test and Evaluation
Prepared remarks of Paul G. Kaminski, undersecretary of defense for acquisition and technology, International Test and Evaluation Association Symposium, Huntsville, Ala., Tuesday, October 3, 1995

It is a great pleasure to be with you here in Huntsville. I always enjoy "escaping" the confines of the Washington, D.C., beltway. It allows me to meet you -- the real practitioners of our trade -- and to get firsthand exposure to your issues and your views. I would like to take this opportunity to thank the International Test and Evaluation Association for extending me an invitation to share some of my views on reinventing T&E.

Let me begin by sharing a reinvention success story with you. It's the story of Team New Zealand -- the America's Cup sailing team that defeated Team Dennis Conner five races to none, with overwhelming margins in each race. Historically, teams from large countries have leveraged their nations' manufacturing bases and technological resources to dominate the competition. So how did a team from a small country such as New Zealand triumph in a sport driven by advanced technology?

Team New Zealand gained a competitive advantage by reinventing the yacht design process. Much like the acquisition programs in the Department of Defense, they needed to meet extremely demanding schedules, work within a constrained budget and deliver superior performance. The analysis and optimization of yacht design has traditionally relied upon the testing of scale models in water towing tanks and wind tunnels. Each test requires the construction of a new, precisely machined prototype, and the testing itself can take weeks.

Unlike the larger America's Cup competitors, Team New Zealand did not have the corporate sponsorship to obtain ready access to expensive wind tunnels, towing tanks or supercomputers. Instead, the team used less expensive workstations to create and drive its own simulation-based process of design, analysis, test, feedback and redesign. Moreover, by locating its computer network at the team's sailing facility, Team New Zealand was able to tightly integrate the designers, testers and sailing crew into a cohesive team.

As many as several hundred simulated designs were analyzed each night. The next morning, the team chose the two best designs for a given component, had them manufactured in the machine shop next door, installed them on two identical boats and raced the boats to see which performed better. With the aid of the simulation, they isolated which factors helped the winning boat go faster and which slowed the loser down. The designers, testers and sailing crew worked side by side to perform about 10,000 simulated iterations over a two-month period. By doing so, they created a superior capability, affordably and in less time than their competitors.
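To make that loop concrete, consider a minimal sketch of such a nightly simulate-select-race cycle. The surrogate "speed" model, the parameters and all names below are purely illustrative assumptions, not Team New Zealand's actual design tools or data.

    # Hypothetical sketch of a nightly simulate-select-race loop. The surrogate
    # speed model and all parameters are illustrative, not real yacht data.
    import random

    def simulate_speed(design):
        """Crude surrogate model: predicted boat speed for a candidate design."""
        keel, sail = design
        return 10.0 - (keel - 0.6) ** 2 - (sail - 1.2) ** 2

    def race(design_a, design_b):
        """Two-boat comparison: simulated speed plus on-the-water noise."""
        score_a = simulate_speed(design_a) + random.gauss(0, 0.05)
        score_b = simulate_speed(design_b) + random.gauss(0, 0.05)
        return design_a if score_a >= score_b else design_b

    best = (0.0, 0.0)
    for night in range(60):  # roughly two months of nightly iterations
        # Overnight: analyze a few hundred candidate variations of the current best.
        candidates = [(best[0] + random.gauss(0, 0.1), best[1] + random.gauss(0, 0.1))
                      for _ in range(300)]
        top_two = sorted(candidates, key=simulate_speed, reverse=True)[:2]
        # Next morning: build and race the two best; the winner seeds the next cycle.
        best = race(*top_two)

    print("Converged design:", best, "predicted speed:", round(simulate_speed(best), 3))

The point of the sketch is the feedback structure: cheap simulation narrows hundreds of candidates to two, a single physical comparison settles the rest, and the result seeds the next night's analysis.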

What are some of the conclusions that one can draw from the Team New Zealand reinvention success story? The most important is the bottom line: For me, it's the undeniable fact that Team New Zealand created a superior capability, affordably and in less time than their competitors. When you stop to think about it, that's our bottom line too! In defense acquisition, our job is to put equipment that is second to none in the hands of America's warfighters as quickly and inexpensively as possible. If we fail, the stakes are a lot higher. Instead of losing a yacht race, Americans could lose their lives in battle.

The real issue here is how we reduce acquisition cycle time to maintain the technological superiority of our combat forces. The Department of Defense cannot afford a 15-year acquisition cycle time when the comparable commercial turnover is every three to four years. Without a doubt, our No. 1 priority must be to shorten the cycle time for developing new weapon systems or inserting new technology into existing systems. In a global market, everyone, including our potential adversaries, will gain increasing access to the same commercial technology base. The military advantage goes to the nation that has the best cycle time to capture technologies that are commercially available, incorporate them in weapon systems and get them fielded first.

Shortened acquisition cycle times will reduce cost, increase management continuity and limit the opportunity for "mischief" to occur on a program. By mischief, I mean the kind of second-guessing and changes in program direction that can -- and do -- occur when key decision makers move on to new jobs during a long acquisition cycle. When this happens, we end up making the entire contractor team wait while we shift gears, replan the program and strike out on a new course.

As the department's senior acquisition executive, I have an opportunity to review the status of all the department's major acquisition programs on a quarterly basis. More often than not, T&E-related problems are cited as a major culprit causing many of our programs to experience excessive schedule delays and enter this downward mischief spiral. Invariably, either a test asset is not available, an argument about a measure of operational effectiveness remains unresolved, or a T&E plan is not ready in time to support a milestone decision.

I am not attributing these problems to the T&E community. They are symptomatic of the need to better integrate all of the elements of our acquisition process and, through that integration, reduce our cycle time.

My challenge to the test and evaluation community, both developmental and operational testers alike, is to participate constructively in this process to provide greater attention to cycle time. To meet this challenge, I believe that a cultural change is necessary -- one that is already under way in many cases and in others, one that can only begin by re-examining the fundamental role of test and evaluation in the acquisition of new military capabilities.

This leads me to the second major conclusion that I draw from Team New Zealand's success. Every member of the team focused on the bottom line -- winning the America's Cup! The driving imperative for designers, testers and the sailing crew was to win races, not to optimize their own functional area performance.

Sometimes, many of us in the acquisition business forget that our main aim in life is to field systems. It is easy to insist on endless iterations to get the best design or the perfect test plan; it is a lot harder to field systems that work and are affordable. This is the cultural change I am speaking of. Testers, like program managers, must be committed to program success. We all must be on the same integrated product team -- the one that is responsible for delivering a superior capability to the warfighter.

In some cases, my sense is that we must shift our outlook and approach from oversight and after-the-fact reporting to early insight. We need to make sure test and evaluation expertise is made available to the program manager early on so that we prevent problems rather than try to identify them in a "gotcha" fashion when we write a test report or at the Defense Acquisition Board review itself. We should be building in quality and excellence from the start -- not trying to inspect it in two weeks before the test program begins or the DAB meeting occurs. In my mind, this is one of the important value-added contributions that the developmental and operational test and evaluation communities must provide.

At this point, let me stress that being responsible for program success does not compromise a functional member's independent assessment role. T&E team members will still be accountable for ensuring each program has a workable approach. We are not getting rid of the independent assessment function. T&E team members must continue to perform an independent assessment and satisfy themselves that a program is executable, but I expect this to be done by engaging early and in a constructive way. We are not working constructively as an integrated team if we have to wait until the test report is written or the DAB meets to surface surprises.

I also expect stakeholder behavior -- when concerns are raised in a constructive way, they should be accompanied with workable suggestions and practical solutions. As we institutionalize this cultural change, we should remember that we're implementing a process to secure early insight, not event-driven oversight.

And this leads me to the third major conclusion that I draw from the Team New Zealand success story. They integrated modeling and simulation into their test and evaluation process to secure early insight, reduce costs and shorten the acquisition cycle time. The department's senior leadership is strongly committed to greater use of modeling and simulation, especially models that incorporate real physical underpinnings. With such models, we can actually eliminate certain tests and focus test resources on the areas where our understanding is less. In many cases, we should be conducting tests to validate our models and simulations.

Just this last August, I approved the Department of Defense's Modeling and Simulation Master Plan to improve our ability to support decision making in each of the four pillars of military capability: readiness, modernization, force structure and sustainability. With regard to the modernization pillar, the master plan establishes a framework for incorporating modeling and simulation as an integral part of our test and evaluation process. My sense is that modeling and simulation will:

-- Help create developmental and operational test scenarios and improve the planning process;

-- Allow dry runs of planned tests in "synthetic environments" to verify that test conditions can be evaluated cost-effectively and with sufficient realism;

-- Be used to help plan the test program and select test objectives in a way that reduces the need for expensive field test assets, many test iterations and long-duration tests;

-- Facilitate evaluation of system performance characteristics otherwise not possible due to limited test resources, environmental restrictions and safety constraints; and

-- Permit operational testers to use virtual prototypes to perform early operational assessments.

I firmly believe that the test and evaluation community is stepping up to the challenge and beginning to achieve the vision articulated in the Modeling and Simulation Master Plan. To further institutionalize this trend, I am requiring that the Simulation, Test and Evaluation Process -- let's call it STEP -- shall be an integral part of our Test and Evaluation Master Plans. This means our underlying approach will be to model first, simulate, then test and then iterate the test results back into the model.

Just as we speak of "test, fix, test," we should now plan our development programs so that they "model, test, model." My intent is to ensure modeling and simulation truly becomes an integral part of our test and evaluation planning. We must consider all the tools and sources of information available to us in developing and evaluating the performance of our weapon systems.
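A minimal sketch of what such a "model, test, model" loop looks like in practice follows. The simple physics model, the numbers and the calibration rule are assumptions chosen for illustration only; they do not represent an actual DoD simulation, test procedure or data set.

    # Illustrative "model, test, model" loop: predict with a model, run a test,
    # then feed the test result back to recalibrate the model. All values are
    # assumed for illustration; this is not an actual DoD tool or data set.
    import random

    drag_estimate = 0.20          # model parameter to be refined by test data

    def model_range(velocity, drag):
        """Simple physics-based model: predicted range, degraded by drag."""
        return velocity ** 2 / 9.81 * (1.0 - drag)

    def field_test(velocity, true_drag=0.27):
        """Stand-in for an expensive live test: ground truth plus noise."""
        return model_range(velocity, true_drag) + random.gauss(0, 5.0)

    velocity = 100.0
    for iteration in range(5):
        predicted = model_range(velocity, drag_estimate)   # model first
        measured = field_test(velocity)                    # then test
        # Iterate the test result back into the model: nudge the drag estimate
        # toward the value that would have reproduced the measurement.
        implied_drag = 1.0 - measured * 9.81 / velocity ** 2
        drag_estimate += 0.5 * (implied_drag - drag_estimate)
        print(f"iter {iteration}: predicted {predicted:.1f} m, "
              f"measured {measured:.1f} m, drag estimate {drag_estimate:.3f}")

Each pass through the loop spends one test event and returns a better-calibrated model, so later predictions can stand in for tests that no longer need to be flown.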

As I mentioned earlier, the test and evaluation community has already begun incorporating modeling and simulation into the technical risk management process -- with impressive results -- both in improving the productivity of test and in helping to reduce cycle time.

For example, the GBU-28 was developed in less than six weeks during Desert Storm, relying almost exclusively on lethality and vulnerability modeling to design the weapon and predict its performance.

In another similar case, Army testing of bridge durability -- a process which traditionally requires 12 weeks to do 3,000 crossings -- was reduced to nine weeks with a mix of actual crossings and simulation.

The AIM-7P Sea Sparrow was developed and tested using only 10 launches of the planned 50. The Navy was able to eliminate the remaining 40 flight tests using an end-game effectiveness model to predict the lethality of the missile.

At the Air Force's Arnold Engineering Development Center [Tenn.], modeling and simulation has been used in a big way to help lower the cost of testing to the customer. Average time in the PWT 16T wind tunnel has decreased from six weeks to three to four days.

At Eglin Air Force Base [Fla.], use of the PRIMES [Preflight Integration of Munitions and Electronics Systems] ground simulation led to a 35 percent reduction in cost and a 300 percent increase in data capture during a recent flight test program of the APG-63 radar.

And finally, the Naval Air Warfare Center Aircraft Division at Patuxent River [Md.] used state-of-the-art simulation in conjunction with ground test capabilities to reduce flight test hours and costs by a third in evaluating ALQ-99 receivers and ALQ-149 communications countermeasures equipment on board the EA-6B aircraft.

The common thread in all these examples is the innovative use of modeling and simulation to make it happen better, faster, cheaper.

Before I summarize, let me offer one final thought on operational test and evaluation. Phil Coyle [DoD director of operational test and evaluation] and the operational test community have done a superb job in helping developers marry technology and employment doctrine. This is something that I think has not been given adequate emphasis in the past. We have traditionally underestimated the importance of developing the appropriate doctrine, the tactics for employment, the training, and the people who use technologically advanced systems.

When I look back to my own personal experience with the F-117 stealth fighter in the early 1980s, I believe one of the major contributions we made in that program was the effort to understand the limitations as well as the strengths of the technology. As advanced and significant as the stealth technology was, we could not use it effectively until we understood the limitations (there are some) and developed mission planning tools to work around those limitations.

The result was a system that was operationally effective in Desert Storm, operating with no apparent system limitations. The real issue here is not simply developing the best technology, or even building the best equipment, but getting this combination into the field and using it wisely -- and involving operational testers early in the process must be a key part of getting this job done.

In summary, our challenge is much the same as that of Team New Zealand, and our solution path is similar -- a close-knit team working together and employing an approach that fully integrates the use of simulation into the design, test and evaluation processes.

Our bottom line is to succeed as Team New Zealand did -- to field a superior capability, affordably and in less time than our potential adversaries. Reducing acquisition cycle time is going to require a cultural change composed of the following elements:

-- An integrated simulation, test and evaluation process that provides continuous insight to ensure that quality is built into programs from the start;

-- An emphasis on prevention over cures, where simulation, test and evaluation is used to identify and resolve problems early; and

-- A focus on overall program success, not suboptimum functional area performance. An elegant test program is meaningless if we fail to get superior capability into the hands of the warfighter.

Cultural change cannot be directed from the top. We need buy-in at the working level as well. I hope you will join me to build on the experience of Team New Zealand by working together to improve our product -- fielded equipment that works within a shortened acquisition cycle time.

 

Published for internal information use by the American Forces Information Service, a field activity of the Office of the Assistant to the Secretary of Defense (Public Affairs), Washington, D.C. Parenthetical entries are speaker/author notes; bracketed entries are editorial notes. This material is in the public domain and may be reprinted without permission. Defense Issues is available on the Internet via the World Wide Web at http://www.defenselink.mil/speeches/index.html