Project Management: Projects and Ethics



Although not a software project, the team that worked on the Human Genome Project made sound ethical choices. First, their work was always open to public scrutiny, inspection, and comment, so any mistakes overlooked by the project team could be caught by an outsider. There was no room for ego on the team. Second, the team was made up of members from all over the world. If the project’s effect was going to be planetary (and it is!), then it was only ethical to include a worldwide panel of team members. Lastly, upon completion, the results of the project were made available to the world free of charge. The project was completed well ahead of schedule, and it always remained true to its sponsors and to its own team members.


Below is an example of poor ethical behavior. It concerns the Therac-25 project, claimed by many to be the worst software engineering disaster to date. (I wrote this article earlier for a software engineering class):

“The patient was receiving [radiation] treatment on his face. When the [accidental] overdose was administered he yelled and then began to moan. He received neurological damage, fell into a coma and died only 3 weeks later.”

This is an excerpt from an excellent summary article at http://www.voguelph.ca/~tgallagh/. The Therac-25 case has been called by many the worst software engineering glitch in history. Six patients received massive overdoses as a result of faulty code; several were seriously hurt, and at least three of them died.

The Therac-25, a medical linear accelerator, was engineered by the Canadian government-owned company Atomic Energy of Canada Limited (AECL) and a French company called CGR. It was designed for use in cancer treatment via radiation therapy. The T-25 was based on the earlier Therac-6 and Therac-20 models; however, unlike those models, the T-25 was completely computer-controlled. The units were used in both the US and Canada. The accidents occurred between 1985 and 1987.

The T-25 was buggy to start with. Operators complained that the units would mysteriously lock up. The first accident occurred in 1985, two years after the model was safety-tested and released onto the market by AECL. This first incident, an overdose, resulted in serious injury but was not fatal. AECL, however, denied any design faults in its product. The injured patient sued AECL and the Marietta Oncology Center where the therapy was performed, and the case was settled out of court. The sad thing is that the FDA did not hear about this first incident until after two further incidents occurred in Tyler, Texas nine months later. This “cover-up” was purely on the part of AECL.

The second incident, in AECL’s native Canada, resulted in death. This time, AECL responded by making some minor hardware and programming changes to the T-25. They were all WRONG solutions to the problem. Further, some of the other non-life-threatening bugs that the operators had been complaining about were completely ignored. Unfortunately, AECL did not take any formal action until the fifth incident. Even worse, the problem-isolation effort was made by a non-AECL employee: a dedicated hospital physicist, Fritz Hager, at the East Texas Oncology Center. Hager was able to reproduce the malfunction that was the root of the problem.

AECL used an extremely poor design strategy for the T-25. Further, AECL showed complete and utter disregard for human life in the manner in which it handled the problems of the T-25. Here is a summary of some of these problems:

(1) there was only ONE person programming the code, and he did most of the testing himself. This person resigned shortly after the incidents started occurring, and his code was poorly documented.

(2) the machine was tested for only 2,700 hours; safety-critical systems need many more hours of testing than that.

(3) the T-25 was tested as a whole system rather than as separate modules. Modular testing reveals many more problems (see the sketch after this list).

(4) AECL should have reacted seriously to the problem after the FIRST incident instead of waiting until the fifth. Even when it formally reacted, it relied on a hospital physicist to isolate the problem.
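
To make point (3) concrete, here is a minimal sketch (in Python) of what module-level testing looks like. It is purely illustrative and has nothing to do with the actual Therac-25 code, which was written in assembly; the routine compute_dose_units, its parameters, and the test cases are all hypothetical. The point is simply that a unit test can probe the boundary and error conditions of one small piece of logic directly, which a whole-system "did the treatment run?" check would likely never exercise.

import unittest

def compute_dose_units(prescribed_cgy, calibration_factor):
    # Hypothetical, simplified dose-setup routine (NOT the real Therac-25
    # logic). Converts a prescribed dose in centigray into machine units.
    if prescribed_cgy <= 0:
        raise ValueError("prescribed dose must be positive")
    if calibration_factor <= 0:
        raise ValueError("calibration factor must be positive")
    return prescribed_cgy * calibration_factor

class ComputeDoseUnitsTest(unittest.TestCase):
    # Module-level tests exercise boundary and error cases of one routine
    # directly, cases that testing the assembled system as a whole misses.
    def test_typical_prescription(self):
        self.assertAlmostEqual(compute_dose_units(200, 1.25), 250.0)

    def test_rejects_zero_dose(self):
        with self.assertRaises(ValueError):
            compute_dose_units(0, 1.25)

    def test_rejects_negative_calibration(self):
        with self.assertRaises(ValueError):
            compute_dose_units(200, -1.0)

if __name__ == "__main__":
    unittest.main()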

Nancy Leveson and Clark S. Turner present an extremely thorough report of the T-25 disaster in a 1993 issue of IEEE Computer. A reprint of this can be found here. In the second paragraph of this report, they quote Frank Houston as saying, "A significant amount of software projects for life-critical systems comes from small firms, especially in the medical device industry; firms that fit the profile of those resistant to or uninformed of the principles of either system safety or software engineering." This makes the case against AECL even more compelling. AECL is a Crown corporation of the Canadian government, and the Therac-25 project was heavily funded and resourced. In light of this, there was no excuse for failure. The politics of this case are undeniable. If AECL had been a firm with no government connection, a government agency would have taken swift and timely action to rectify the problem.

In conclusion, I think honesty is the best policy. The project manager and his or her team must be honest with their clients, with their company, with each other and themselves, and with everyone who may come in contact with their product. The AECL team clearly violated these principles.