Tuesday, May 16, 1995 - 1:30 p.m.
(NOTE: Participating in this briefing were Mr. Bacon and Dr. Paul Kaminski, Under Secretary of Defense, Acquisition & Technology)
Mr. Bacon: Good afternoon.
Dr. Paul Kaminski is going to start off by talking about the latest steps the Defense Department is taking to improve our acquisition policies and procedures. He'll take some questions when he's through making his statement.
Dr. Kaminski: Thank you, Ken.
As you all know, Secretary Perry has been absolutely committed to acquisition reform. So am I. In fact, it has been at the very top of my priorities since I undertook this job.
On May 10th, Secretary Perry signed a letter directing a fundamental change in our acquisition oversight. As a result, we have embarked on a whole new approach to defense acquisition, fundamentally changing the way we conduct the acquisition process. This bold, re-engineered oversight approach and the related review process will best serve our warfighters, and it will result in the conservation of public funds.
What we are doing here is institutionalizing a process called integrated product and process development, or IPPD, using an integrated product team, or IPT, approach.
The idea behind this approach is to encourage a partnership of the involved stakeholders operating in parallel, as opposed to the old process, which was a serial and often adversarial one among the various government and contractor organizations.
In this new approach, the user, the program manager, the program executive officer, the service component staff, the DoD staff and related decisionmakers, and the contractor involved will all share ownership in their programs, and they'll have a stake in making the program successful.
What I'd like to do here is describe by example what I mean by this IPT approach. Let me contrast it for you, if I may. The traditional approach we used in the past was one of late senior Service and OSD program involvement. The process would begin to spin up as we approached a major acquisition milestone; about six months before the major milestone decision, we would begin the functional review of the documentation supporting the program and an end-of-cycle program audit that very often resulted in an after-the-fact program critique -- that is, finding problems in the programs.
Building up to this was a process of numerous sequential reviews and pre-briefs leading to a decision: staff reviews with the program executive officer, Service staff reviews, a Service acquisition-level decision review, and our OSD systems committee review; finally, a Defense Acquisition Board readiness meeting and then a full Defense Acquisition Board meeting. Program managers going through this mill sometimes would be subjected to 40 or 50 different briefings on their programs.
This new IPT approach is different in that it involves early Service and OSD staff involvement at the start of the program--three to four years before we've gotten to this milestone decision. It involves teaming with the program manager and the program executive officer to develop a quality program strategy and plan. It involves a joint determination of the program review and milestone decision requirements; a joint determination of the functional IPT requirements and the composition of that team; and a joint determination of the required decision documents--as opposed to a one-size-fits-all approach, the same stamp-it-out approach we had been taking for every acquisition program. And it involves early, joint issue identification and resolution, as opposed to OSD finding fault in the late stages of the program.
The idea behind this overarching IPT approach is to have Service and OSD staff working together to identify and resolve issues early in the program.
I think the result of this process will be to provide the best possible equipment to our warfighters in a more efficient and cost-effective manner. Again, the idea is early insight and proactive involvement by all members of the government-contractor team. The focus here is what I would describe as placing an emphasis on early insight as opposed to after-the-fact oversight.
The intent here is to emulate the best commercial practices, using very similar IPPD and IPT approaches to reduce government decision cycle times.
I've been keeping a little book on our cycle times here, comparing results in 1994 with what we've been seeing as we've begun to implement this program in 1995. One thing I've kept book on is our acquisition decision memos--the time required from the Defense Acquisition Board review until the memo is signed to implement the program. In 1994, the average period between those two events was 23 days. In 1995, the average has been two days.
Another result is resolving problems in process. Of the 13 Defense Acquisition Boards scheduled in 1995, only eight were convened. The remainder weren't needed because all of the problems were resolved at the readiness meeting, and a formal Defense Acquisition Board review was not necessary.
In eliminating this one-size-fits-all approach to acquisition, we're also seeing a significant reduction in the documentation that is required. Program managers will prepare only those documents that are required by law or pertinent to the decision at hand.
I have one graphic illustrating this on one of the earlier programs that we put through this process. This is a space-based infrared system. This is a system which had previous incarnations as the FUSE program. It is a replacement for our DSP program.
What I've indicated here is another illustration of the savings in preparation time. For this kind of program in the past, preparation typically began six months before the decision milestone; the experience on this one was two months. That's four months saved in the process.
As for the change in one-size-fits-all documentation: typical documentation for a program of this size would be in excess of 1,000 pages. The documentation required for this one was 47 pages.
The results obtained in instituting this process are the work of a dedicated and empowered team of people who for three months worked day in and day out to develop a comprehensive set of 33 recommendations and implementation plans. That effort was led by General John Caldwell. As the team delivered its reports, they were implemented across all of our Services and components under the guidance of a terrific management team, which included my Principal Deputy, Noel Longuemare; my Deputy for Acquisition Reform, Colleen Preston; and the Service Acquisition Executives: Gil Decker from the Army and Darleen Druyun from the Air Force. I would note with a significant degree of sadness the absence of the late Mr. Clark Fiester, who was instrumental in representing the Air Force in these activities. I would also recognize Admiral Bill Bowes, sitting in for Mrs. Nora Slatkin who, as you know, now has a new assignment. Nora was also very instrumental in this activity.
I'd also like to introduce Admiral Len Vincent from the Defense Contract Management Command, who is associated with implementation of much of the process; Dr. George Schnecter, who has been the director of our Conventional Systems Committee, key in the acquisition process; Mr. Tony Valletta, who has been the director of our C3I Acquisition Committee; and Dr. Spiros Pallas, our principal deputy director for Tactical and Strategic Systems.
This has been a team effort, starting from the process action team that made these 33 fundamental recommendations for change, and continuing with the group I just described working as a team to implement this program and modify our process in a fundamental way. As I said, we tried this in a small, experimental way during the beginning of 1995. The whole process I've described has now been implemented following Secretary Perry's direction.
I would be happy to take any questions directed either to me or to any members of our team.
Q: This involves, I take it, military planners, DoD officials, contractors, virtually everybody, people who will use the weapons, from the point that someone even considers the weapon. Is that what we're talking about here?
A: That's right.
Q: So that you'll have fewer problems as you go along, right?
Q: Why wasn't this done before? Why wasn't this done all along?
A: It seems like such a wise thing to do, doesn't it? But the system that evolved over time set up levels of review, and I think the fundamental difference here is very similar to the one in dealing with issues of quality. Our approach in the dark ages of years past was to inspect quality in. You had a team do a development, a design, or a component, and then a series of inspectors would look at it after the fact and say, "Did it do what it was supposed to?"
We, in effect, had a system here in the DoD doing the same thing--looking to inspect quality in; doing a series of reviews at each of these levels that I've repeated within the Service; and, when those inspections were done, it would come to OSD for another set of inspections. Often the challenge was, "Could one find a fault?"
This is a different approach. This is building quality in from the start with an integrated team involving all the stakeholders.
Q: Does this mean there will be fewer auditors, fewer GAO audits, less oversight on the factory floor? And is this a way that OSD can assert control of a system very early on, eventually to the point where you don't need Service acquisition people?
A: I think this will lead to fewer acquisition oversight people at each of the levels you described. I wouldn't really describe this as OSD control, though; I would describe it as a stakeholder arrangement. As for the incentives I want to have in place for the OSD staff supporting me: I'm not in a mode of giving points to somebody who finds a problem very late in the development. Points are earned here for being part of a team that recognizes problems early on and is constructive and proactive in offering solutions to be implemented.
So it's a difference, again, between building quality in and inspecting it in after the fact. It should involve fewer inspectors.
Q: We have seen a pattern of criminal fraud on the part of defense contractors and major subcontractors over the years. We've seen the best intentions end up below spec, overweight, underclassed, and overpriced. How do you deal with this inherent competition between the customer, who wants the program, and the contractor, who wants to maximize its profits--and with contractors' very roughshod way of dealing with the Defense Department in the past?
A: I don't see a fundamental conflict between making a profit and delivering an effective system. In fact, I think at times the constraints we have put in place have created problems in both. So those two are not fundamentally at odds. If people have a common understanding of what it is we need to develop--if they are on the same sheet of music, communicating with each other about what the need is--I don't see a fundamental problem on that score. I think a good piece of our problem has been miscommunication. I don't think this process by itself does anything different to deal with someone who intends to be fraudulent.
Q: You would have avoided the C-17 if you'd had this program?
A: I think we would have had a much more sound foundation on the C-17 in terms of having all the stakeholders in the program involved--rowing in the same boat.
Q: Did you consider banning contractors who have been convicted of criminal fraud two times in the past five years from even participating in defense contracts?
A: I think that's really a separate issue. We have procedures in place for debarment, and certainly I would encourage enforcing those procedures when debarment is appropriate. A specific number of offenses, I think, is too narrow a test. It depends upon the nature and the degree of the offense.
Q: Surely you still see the need, though, for the preservation of some sort of arm's-length oversight, don't you? What would that be, and how does that fit in...
A: There's nothing in this system that prevents me from getting independent review from members of my staff who are participating on a team. They still have the responsibility to provide that review. I still have the responsibility as the Defense Acquisition Executive to undertake it, as does each of the Service Acquisition Executives. Those responsibilities haven't changed.
Q: Wouldn't the OSD member who is on the team have some sort of personal stake in trying to prevent critical outside oversight?
A: They have a stake in trying to put forward the best solution for the program, but they also have a responsibility, when that is not workable, to come back and report it--to report that they've made best efforts and there is no path ahead on this program to reach the objective.
Q: You mentioned there are 33 recommendations in the team's report. Is there anything you didn't implement? And why?
A: Yes, there are several recommendations that we didn't implement in the form that was recommended. The way we arrived at that was by operating with this team you see sitting here--the fundamental stakeholders in this process--debating whether the recommendation provided to us would have achieved the objective we were looking for. So we did not implement every one of the recommendations to the letter.
Q: Does this suggest that, in awarding contracts, you will give weight to contractors who have proved more trustworthy and perhaps more efficient in the past? Does this mean that once a contractor begins building a weapon, you will use less oversight--keep less of a tight watch on them?
A: The way I would describe it is that this ought to mean fewer inspectors in the system. Certainly past performance is going to be a factor in future awards.
Q: When you talk about major systems, where is your line there? Is that just a cost decision? Pricing?
A: Our major program definitions operate on the basis of the cost of the system. But I would expect the Services to be employing pieces of this same integrated product and process team approach even on the non-major systems. The directive is for major systems, but I think the idea we're talking about... I will let the Service Secretaries speak for themselves. Gil, would you like to address that question?
Decker: We've had a parallel effort going in the Army. I don't know if you know the hierarchy of things, but the Army's DAB-equivalent process is called an ASARC -- Army Systems Acquisition Review Council -- which I co-chair with the Vice Chief. It has had about the same voluminous documentation and the same history of arm's-length review that Dr. Kaminski described for the DAB.
Depending on the nature of the program, it's either delegated to the Army, Navy, or Air Force and managed there, in which case I am the milestone authority, or it's reviewed at the OSD level. That is a matter of size and scope.
When a DAB-level program came up--you heard him describe all those reviews--that was multiplied all over again at the ASARC that preceded the DAB decision. So we've had a process action team in the Army that's just concluding, and we've tried its approach out on some ASARCs. It streamlines the process, uses an IPT approach, and reduces the demanded documentation. The first program we tried it on was the JSTARS ground station module, on which the Army is a major contracting element. We ended up not even having to hold the actual ASARC--we just reviewed the documents, which looked good--and it worked very well.
Maybe the last comment I'd like to make... I appreciate the thoughts some of you may have, from the questions, about how we protect the taxpayers' interests and make sure that we have good performance and minimal malfeasance. I would submit the following. I come from the business world, and I did a lot of defense contracting before I was asked to take this job. My feeling is that there are plenty of laws on the books, and if a contractor is found guilty of malfeasance of any kind, they should be punished with inordinate severity. It's an insult to me, as a former member of that community, when somebody is found guilty of outright fraud.
We have to distinguish between fraud and screw-ups. We will make mistakes. We will have programs that don't succeed. And the cost of thinking we're preventing fraud, waste, and abuse with all this oversight is far greater than the minimum that's going to occur naturally and statistically, because some people are crooked and some aren't. I think you'll see a decline in the number of attempts to end-run the system.
Kaminski: Adding six levels of review isn't needed as a deterrent to stop fraud, waste, and abuse.
Decker: So we're pushing down through the Services the same kind of approach that we've worked out through this PAT.
Q: The point is, these companies who are convicted of criminal fraud are back in business in this building the next day. There are no restrictions, no ban on these companies. You can get caught as many times as you want. Who's protecting the taxpayers' interests from these people?
Decker: My answer to that is not to debate with you. The people who are found guilty go to jail--as they rightfully should--or they're fined severely, or whatever the appropriate legal penalty is. You'll find, I believe, that companies genuinely found guilty of criminal malfeasance see a massive change of management. If the company has fixed its problems and its quality, then that's okay. There's a debarment process.
That's a tough issue, and I'm not quarreling with you, but I think bad management needs to be gotten rid of. Then you have to adjudge, "Is the rest of the company solid and have they overhauled and gotten their act together?" I think we do a pretty good job of that. I like to think we do.
Q: Probably a more endemic problem in the defense acquisition process than fraud has been that no program manager ever wants to see his program fail, and there's a constant effort to hide the fact that programs are failing. By building in this team concept, you're almost ingraining that process. Everybody from the beginning is now part of the team, and the team mentality is not to let the program fail, or not to let it appear to be failing. That tendency would seem to be entrenched by the process rather than prevented.
Kaminski: Actually, I think quite the opposite. Various members of this team are involved in other programs as well, so they have a broader responsibility than a commitment to a particular program. They're stakeholders representing various functional interests. I actually think this process is better than a sequential, hierarchical system in which a program manager is not involved in an integrated process and can therefore keep information about the program to him or herself. Information is much more likely to come out in the early phases with this kind of integrated team, so I think there's room for much better communication and information flow. It's much harder to hide problems in this kind of environment. I think the incentives are to get the problems out and get them worked on, rather than to try to hide them as we move up through the review process.
Q: Any possibility in this system of earlier termination of programs that are found to be going in the wrong direction?
A: Yes, I think there is. I also think there's another really important issue here: making trades on cost and performance as part of our program process--making judgments about 80 percent solutions that cost considerably less, and doing that on an informed team basis with the user, the program manager, and the OSD stakeholders involved, rather than building rigidly to requirements with a capital "R" and taking whatever cost falls out. That's an approach we're working through systematically.
Q: Increasingly the Pentagon is spending money on modifications rather than new systems. Will this approach also apply to mods? The B-1 upgrade, for example, is a multi-billion dollar program. Will it apply to that?
A: This approach will apply to mods. It will probably have a smaller impact on mods than on systems undertaken from the start, but I think it will have some impact.
Q: At what milestone does this actually get implemented? When do you create the IPT?
A: We would start the IPT right at the construction of the new program, moving from milestone zero to milestone one.
Q: How does this affect programs that are ongoing already?
A: As programs that are ongoing come up for reviews, we've been implementing this process. Those statistics that I quoted to you actually result from our early beginnings of implementing this process at the beginning of 1995. Actually, I think we began to first do some things maybe even at the end of '94.
Q: Will you implement this for Seawolf?
A: We have not come to a major program review yet in which this process is working. We intend to implement it for Seawolf.
Q: Will there need to be further changes in the internal structure of OSD and in people's job descriptions, for example? Mr. Valletta and Dr. Schnecter are committee chairmen on the DAB. I assume they won't be needed any more under this process--not them personally, but their jobs won't be needed under the IPT process. What further changes...
A: The committee chairman responsibility will be changing, but in fact you will see the same people involved, chairing some of the IPTs in the process. If I may... Tony, might I invite you to come up here and give us a little bit of your view of how this process will work from your perspective?
Valletta: Basically we have a new term: the overarching IPT leader. George and I have called ourselves the OILs--overarching IPT leaders--and our job is basically to help the program come through the process. We view ourselves as facilitators, as we did in our previous jobs as committee chairs, helping the Services, the project managers, and the rest of the OSD staff. We do have a number of the staff players involved, from the testers to the economic analysis people and the rest of the agents looking after their particular functional areas.
For the first time, we come out of the building and actually work way up front in the program, as Dr. Kaminski has said, as a team with the Service staff and the PM's staff. So basically my job and Dr. Schnecter's job is to facilitate that entire operation and, as Dr. Kaminski said, to find all of the issues we need to find up front in the program. That includes problem areas that we bring to Dr. Kaminski's attention--up to and including possible termination of the program, if in my judgment and the SAE's judgment the program needs those kinds of actions. But so far the early involvement has produced nothing but positive effects. We have really fixed problems as a team, and, as Dr. Kaminski said, we have avoided DABs for the last three programs. We have done paper reviews for all of these programs.
Kaminski: We got to the DAB readiness review and there were no issues. Somebody said it very well in our earlier discussions, though: maybe what we're doing here with this IPT approach isn't so surprising. It seems like a pretty sensible thing to do. Perhaps the surprising thing is the way we've been going about it for years in the past.
Q: We're always astounded by the lack of common sense.
Press: Thank you.