There's more than one way to do things
Comment by gods
on Feb 11, 2000 at 00:06 UTC
I think I understand what you're talking about. With the roots of what I think you're saying, I would say that I have no particular quarrel.
Though I interpret what you're saying as trying to get me to consider the issue from the 'buyer's' (or customer's) point of view: I am the buyer/customer of these systems, and my job as Chief Systems Engineer is to advise our Program Managers, who hold the checkbooks, on what we expect our suppliers to produce and how we expect them to produce it. I don't mean to be at all critical of what you were saying; I do think your points are largely correct. I just want to be sure you understand that it seemed you may have presumed I work for a supplier and might not appreciate the customer's point of view.
But to the excellent points you presented: my problem is that all of those analyses, evaluations, and considerations you spoke of rarely actually occur. Often that's because we can't find anyone who can figure out what went wrong, since when we lose an asset it isn't around for us to do a post mortem on (it's either in the ocean, blown to bits when destroyed during launch, or floating around in space). In addition, it seems we have too few 'experts' to do all the things you pointed out.
So, failures happen; but far too frequently no one tries to (or can) really get to the 'root cause'. Instead, tiger teams of people try to imagine what 'might' have happened, and then try to determine what testing would have proven or disproven that such a hypothesized cause would occur. Those tests then become, first, required tests for all future projects and, second, folklore that no one seems to remember why it was even dreamed up.
For what it's worth, the same pathology shows up in almost every aspect of our systems production (concept formation, system requirements production and derivation, system design and production, testing, and on-orbit operations). My focus was on testing because, for our particular work, it is the area most malignantly infected with the pathology: the tendency to substitute 'do what we always do' for 'thinking'.
With respect to your comments about 'cost-benefits' and those types of things: they are, for me, certainly noble goals and SHOULD be what is being considered. But I know of no real case where anyone has done any such analyses. They seem to just reason that such analyses SHOULD validate the mass of processes, practices, and policies...but no one (to my knowledge) ever actually does them. What I see is that they follow their 'SHOULD validate' with very loud proclamations that almost quote what you wrote. In your case I believe it was meant as a truly constructive observation; the proclamation litanies, on the other hand, seem all too often to be just a smoke screen (hence, I suppose, part of my initial reaction to what you had written).
In fact, what I talked about earlier (what we have been doing over the past 10 years or so to try to change this, i.e., to try 'thinking' instead of just 'doing it because that's what we've always done') has resulted in producing our last 10 satellites with demonstrated reliability, based upon the number of systems fielded compared to the number that have failed, that matches or exceeds the last 10 to 12 produced the 'traditional' way. And we have produced them for total costs that are between 1/3 and 1/4 of the 'traditional' systems' costs, and on a schedule that is getting quicker (rather than the 'traditionalists', whose schedules are creeping longer): typically 2-3 years compared to the 5-7 years of the 'traditional' ones.
So based upon that evidence (which I would, of course, consider somewhat anecdotal, since the analyses you suggested have not been done on our systems either), I would have to say that at face value the analyses you suggested would likely end up arguing against the 'traditional' approach.
Of course your observation that in my business the cost of losing a system is very high (many tens of millions of dollars) is absolutely true; so you are definitely correct in saying that the amount, focus, and considerations for testing are heavily influenced by that fact.
But at the end of the day, no matter what the costs involved, I think we should still be working to be sure that we put 'thought' into our testing, and be frugal but responsible and accountable for ensuring that the testing we do is what is really needed...not just because that's-how-we've-always-done-it.
That's my thesis...my story...and I'm sticking to it. ;)
ack
Albuquerque, NM