Case Studies in Computer Ethics
East Tennessee State University
Research has shown that one of the most effective ways to introduce ethical issues into the classroom is through peer discussion of case studies across several courses. There are, however, several pitfalls in this methodology.
1. Sometimes the case studies are only tangentially related to computer ethics. Even though they contain significant detail, they are much too broad, and the ethical issues are so clouded that students cannot get hold of the subject.
2. Sometimes the case studies are so vague that there is no foundation on which to base an ethical judgment. Each listener's imagination fills in unspoken details of the case, and each listener bases a different decision on those imagined details. The result of discussing this type of case can be so chaotic that the audience is led to think it has just been shown the truth of ethical relativism.
3. Sometimes teachers will use only catastrophic cases, such as the Challenger disaster, because they are the easiest to find and are good attention-getters. The EXCLUSIVE use of this kind of case leads students to think that computer ethics is limited to such cases. Since most students don't plan on programming life-critical systems, they tend to conclude that computer ethics is probably not relevant to them.
The aim of this column is twofold: first, to provide sufficiently detailed cases that avoid the three problems above, and second, to conduct a public discussion of case studies in computer ethics, thereby using one of the most effective techniques for learning ethics. Our goal is for each column to contain a case study and comments on that case. Once a case has been published, the following column will contain additional responses to it as well as the start of another case.
You are encouraged to participate in this process by responding to cases being discussed and by contributing your own case studies.
Below is a fictional case that involves several ethical issues, such as the acceptable exercise of paternalism, egoism in system design, following the principle of causing no harm, honoring one's contracts (promises), and meeting minimal professional standards.
"The Captain Knows BEST"
A tale of two designs
Fred Fredson, an experienced consultant, is working on a contract with the U.S. Navy to develop a critical system. The system will manage the processing of some very dangerous materials.
The exposure of two sailors to these materials, and their subsequent deaths, led Captain Birk to request the redesign of the system used to manage them. As the only one in his area with any computer training, the captain saw a way to use computer technology to significantly reduce the dangers of this process.
Fred is one of a small number of people who have worked with the type of computer used to manage this process. During several meetings with Captain Birk, Fred has gathered the external requirements for the system. Fred then designs the system which he believes will best control the process.
Captain Birk requests that he be shown Fred's internal software design for the system. Fred, who has taken great pains to develop a system that can be proven to be highly reliable and clearly demonstrated to do what has been requested, welcomes the opportunity to show the details of his work. Fred arranges a meeting with the captain.
At the meeting, halfway into Fred's presentation, Fred is interrupted by the captain who says that the design of the software is not the way he wants it. The captain then describes how he wants the software designed. Fred is greatly concerned because he believes that there is no adequate way to test the software if it follows the captain's design.
Fred tries gently to point this out to the captain. But the subtle reasons for this are missed by the captain who accuses Fred of merely having an egotistical commitment to his own design. The meeting ends acrimoniously with the captain saying that if Fred does not agree to the new design then the contract will be cancelled.
Fred calls you and asks for your help. He relates his belief that if the contract is cancelled, the Navy will not deal with this problem again until there is another death. Fred reminds you of his expertise in testing. He assures you that he designed the software so that testing would yield a very high level of confidence that it would be safe and would resolve the problem that led to the two deaths. He further assures you that, even though he could not prove the captain's design wrong, he could not think of any tests that would demonstrate its reliability. Fred could provide no assurance that the captain's design was reliable, or that it would not at a later date cause significant problems if it were adopted.
He knows that you have been concerned with ethical and legal issues and asks for your advice.
One software engineer has already responded to this case.
"I send Fred back for more information. Here's the phrase that I would focus on if Fred came for my advice:
"Fred could not conceive of any tests that could show the captain's design was reliable..."
Could the captain conceive of such tests? Could someone else? Fred seems to be relying on gut instinct here, at least as far as the case tells us, and gut instincts can be dead wrong in testing and reliability analysis. Unless Fred could be more specific about his objections to the captain's design, I couldn't in good conscience counsel anything but further study.
Now, Fred may very well have some specific, technical objections to the captain's design. For example, Fred's design may be more robust and testable because it avoids history dependence, clearing memory to an initial state every minute. If the captain's design, for example, required infinitely long test trajectories, then Fred has a point.
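The engineer's point about history dependence can be illustrated with a small sketch. This is hypothetical code, not from the case; the function and class names, the pressure threshold, and the running-average rule are all invented for illustration. The contrast it shows is real, though: a stateless design can be verified one input at a time, while a history-dependent design must be verified over whole input trajectories.

```python
# Hypothetical sketch: a stateless controller's output depends only on
# the current reading, so each input can be tested independently.
def stateless_valve_setting(pressure_reading):
    return "CLOSED" if pressure_reading > 100 else "OPEN"

# A history-dependent controller's output depends on every reading seen
# so far, so verifying it requires testing entire sequences of readings.
class HistoryDependentController:
    def __init__(self):
        self.readings = []

    def valve_setting(self, pressure_reading):
        self.readings.append(pressure_reading)
        # The decision depends on the running average of ALL past readings.
        average = sum(self.readings) / len(self.readings)
        return "CLOSED" if average > 100 else "OPEN"
```

For the stateless design, a tester needs one test per input value of interest; for the stateful design, the space of distinct behaviors grows with every possible sequence of readings, which is the kind of design that can demand "infinitely long test trajectories."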
If Fred can document IN A PROFESSIONAL AND TECHNICALLY SOUND MANNER the superiority of his design, and then the captain rejects it out of hand, I would advise Fred of strategies to ethically deal with the obstinate captain. However, with the information I have at the moment, I would not YET rely on Fred's reaction to the captain's decision."
Fred takes your advice and tries to explain to the captain the inherent untestability of his design. Fred calls you back and explains that the captain was either unable or unwilling to understand the problems with testing it. Fred explains the problems with the captain's design to your satisfaction. He further explains that Captain Birk is the only one with any computer knowledge in the division that issued the contract. He knows that you have been concerned with ethical and legal issues and again asks for your advice.
Thomas W. Dunfee, "Integrating Ethics Into The MBA Core Curriculum," The Wharton School of Business; research funded by the Exxon Education Foundation, 1986.
Donald Gotterbarn, East Tennessee State University, Johnson City, Tennessee 37614-0711; firstname.lastname@example.org
Electronic mail from Keith Miller, Sangamon State University; email@example.com