Ethical Considerations in Software Engineering

Abstract

This panel considers some of the ethical issues that arise in the practice of software engineering. The panel's comments are guided by the cases presented below. The panelists' preliminary responses, presented here, include comments on: the relation between software safety and ethics, informed consent and ethical decisions, the customer's responsibility for end product quality, internal performance standards and customer consent to inferior work, and techniques for addressing ethical issues.

Panelists:

Donald Gotterbarn, Associate Professor of Computer Science, East Tennessee State University
Frank Marchilena, Patriot Software Laboratory Manager, Raytheon Missile Systems
Keith Miller, Associate Professor of Computer Science, The College of William and Mary
Carl M. Skooglund, Ethics Director, Texas Instruments
Gilbert J. Tumey, Ethics Advisor, Boeing Computer Services

1 Introduction

There are ethical concerns directly related to the practice of software engineering which are not merely fanciful questions or intellectual exercises. The practicing software engineer engages in a social process during software production and thereby acquires obligations to: users, clients, customers, colleagues, supervisors, the organization for which he works, and the discipline of software engineering. The ethical problems faced by the software engineer involve: the end product, the process of developing that product, and the human interactions in the development of the product.

Computer ethics, according to the mass media, seems to include every conceivable misuse of computers. Many books on computer ethics contain little more than litanies of illegal or malicious acts one can engage in with computers. These acts are sometimes described as conundrums and leave one with the impression that there can be no progress in ethics. Many of the media reports of computer disasters are really pointing at symptoms of failed professionalism.

The practicing software engineer has great potential to affect the lives of others, and he has been made aware of this potential by overblown media reports. The ethical concerns of the practicing software engineer include the types of issues in the media reports. The media emphasis on these issues makes interesting reading, but it misleads us about the significant ethical issues for the professional software engineer. As practicing professionals, the boundaries of the software engineer's ethical concerns go beyond these `pop ethics' issues.

2 Cases

The panel will discuss ethical considerations that arise in the practice of software engineering and will use the cases below to help focus the discussion. These cases have been fictionalized to help direct the discussion toward the ethical concerns in the practice of software engineering.

2.1 The User Interface

2.1a A computer company is writing the first stage of a more efficient accounting system which will be used by the government and will save taxpayers a considerable amount of money. A software engineer is asked to design a user interface for the system. The accounting system and the interface contain all of the functionality described in the requirements. The system is installed, but the user interface is so hard to use that the complaints of the customer's staff are heard by the customer's upper-level management. Because of these complaints, upper-level management will not invest any more money in the development of the new accounting system, and they go back to their original, more expensive system.

2.1b A computer company wrote a radar system for ships which identifies incoming aircraft as friend or foe. A software engineer developed a user interface to display the needed information. This information can be used to determine if defensive action is necessary. The radar system and the interface contain all of the functionality described in the requirements. Although all of the information required was displayed, it was not displayed in a way that was adequate for the users to distinguish a military attack plane from a commercial airliner. This contributed to the loss of many lives.

2.2 System Modifications

2.2a A computer company wrote a very complex system for a national bank. The system manages local checking accounts, savings accounts, and loans. It also manages the bank's branches. Its effectiveness is partially responsible for the bank's success despite the failure of many similar banks. The bank becomes aware of some things it can do that will better protect it from the threat of lower profits. They request that the computer company make immediate modifications to this system and deliver it within a week. The software engineering department decides to do the work.

2.2b A computer company wrote a very complex system for an anti-ballistic missile. The system is being used successfully to shoot down incoming missiles in a current military action. The military determines that the anti-ballistic missile would protect them more effectively if it shot down incoming missiles while they were further away. They ask the computer company to make immediate modifications to the system and deliver it within a week. The software engineering department decides to do the work.

2.3 Professional Judgement

2.3a A computer company is working on an integrated control system for a national shoe manufacturer. The system will gather sales information daily from shoe stores nationwide. This information will be used by the accounting, shipping, and ordering departments to control all of the functions of this large corporation. A quality assurance software engineer suspects that the auditing functions of the system are not sufficiently tested, although they have passed all of the contracted test suites. She is being pressured by her employers to sign off on the software. Her employers say they will go out of business if they do not deliver the software on time. She signs off.

2.3b A computer company is working on an experimental fighter. A quality control software engineer suspects that the flight control software is not sufficiently tested, although it has (finally) passed all its contracted test suites. She is being pressured by her employers to sign off on the software. Her employers say they will go out of business if they do not deliver the software on time. She signs off.

(This case is adapted from McFarland in the May 1990 IEEE Computer.)

3 Responses

3.1 Donald Gotterbarn:

East Tennessee State University
Johnson City, Tennessee 37614
i01gbarn@etsu.bitnet

"USER" - A Four Letter Word

In most situations where ethical problems arise, there are a variety of contributing factors, from the lack of knowledge of the customer to the desire for unreasonable profits. In the above cases, when we focus on those factors which are in the software engineer's control, we see a single pattern that significantly contributes to the ethical problems.

Each case in a case-pair raises similar ethical questions. The first case-pair points to problems generated by a failure to employ professional skills and knowledge about interface design and testing. The second set of cases deals with problems that may arise in modifying a complex system "on the fly." We know that any unanticipated maintenance tends to degrade a system. And we know that maintenance done quickly on a very complex system has a greater potential to produce a highly degraded (failing) system. The third set of cases addresses the failure of the ethical commitment to adhere to professional standards. In these cases, this failure is based on a non-ethical issue, economics, which nevertheless affects our judgement.

The ethical concerns of the second case in each pair (the b. cases) seem more obvious because they each involve a potential risk to human life. The question of safety is one of judgements about risk. Risk is generally defined as a measure of the probability and severity of harm to human health, and a system is considered safe if its risks are judged to be acceptable. These cases are easily recognizable as causes for ethical concern because they involve the potential for physical harm. The degree to which this potential is realized is tied to the quality of the software engineer's professional judgement.

The first item of each case pair (the a. cases) seems more difficult to deal with. They do not contain direct threats to human life. It is a mistake, however, to equate ethics in software engineering just with the threat to human life. The a. cases deal with a threat, not a threat to human life, but a threat to human well-being. Case 2.1a limits a manager's freedom to improve his business, and cases 2.2a and 2.3a are threats to the continued existence of the customer's business. These threats are ethically significant, but are often minimized because they are not issues of safety. The a. cases, just like the b. cases, involve a significant element of skilled professional judgement. The events described also have the potential to negatively affect several lives. The major difference between the a. cases and the b. cases in each set is that the b. cases have to be handled more quickly. The a. and b. cases contain similar ethical issues involving professional judgement. Problems like those described in the a. cases arise when professional judgement fails or is compromised.

I think the pivotal form of professional compromise in these cases is the failure to keep our responsibility to the user in mind as we develop software artifacts. We have a responsibility to develop the artifact using the best of our skills, but we also have a responsibility to develop a usable, quality product for the user which will not negatively affect their well-being. There was little concern for the user in case 2.1a, where the contractual requirements about which information to display were met. The contractor probably violated no laws. He probably would not be found guilty of negligence by a court, nor do I think he would be found guilty under tort law, which deals primarily with non-criminal harms. Nevertheless, there is moral culpability because of a failure to meet his professional responsibility to the user. This moral culpability does NOT require the intent to harm the user. It merely requires the conscious decision by the practicing professional not to fulfill the professional responsibilities to the user. It is this single common failure which significantly contributes to the ethical problems.

Professional concern for the user is evidenced in our attempt to reduce threats in the software and our commitment to produce a quality product based on an implicit agreement with the user to employ our software engineering skills. Unfortunately, when we do think about the user, almost all of our attention is directed toward threats to human life. We make a mistake when we ignore threats to human well-being. It is also a mistake to lower our standards without first considering the impact on the user.

The safety issues give us some insight into a broader ethical responsibility which is not defined by law, but which is defined by our professional standards and our realization that computing products should be designed with the welfare of the user in mind.

3.2 Frank S. Marchilena

Raytheon Company, Missile Systems
Software Laboratory Manager
50 Apple Hill Drive
Tewksbury MA 01876-0901

Case 3: Professional Judgement - Experimental fighter flight control software.

Case three has several interesting aspects that must be considered from an ethical point of view. First, there is the responsibility of the customer (the government in this case) to provide or accept a set of test cases which will lead to a product that will perform its intended functions in a safe manner. The second responsibility belongs to the company which accepted the contract to develop the flight control software. Third, there is the responsibility of the engineers who are developing the software.

The government must accept its responsibility as the customer and the project manager of advanced development contracts. The government must:

provide and/or accept responsibility for the requirements of the system;
ensure that the contractor has the ability to perform the contract;
ensure that the contractor performs the contract.

In case 3, a set of requirements has been developed and accepted by all parties involved. Likewise, a set of tests has been accepted by all parties involved. It is the customer's (government's) responsibility to see that the software meets all requirements. In addition, the government must provide the necessary oversight to be assured that all of the tests were performed, that the results prove that all requirements have been met, and that the flight control software will perform its function.

A company is responsible for providing the climate in which an engineer can perform her/his duties in a manner which conforms with accepted professional standards. This can be accomplished in the following manner:

the company should provide a written standard of moral conduct it expects from its employees in the performance of their assignments;
the company should provide a written set of engineering development standards which are explicit in how a job is to be performed;
the company must provide a means by which an employee can express their "moral" disapproval of the way the company is conducting a certain business transaction.

The establishment of an environment in which employees are aware of their moral duties and obligations is the duty of all companies.

The employee (the quality control engineer) in this case must perform to the standards of the company and the contract. If these standards are not met, then it is her responsibility to report the issue, no matter what the consequences are.

The interesting aspect of this case arises when all standards of performance have been met, yet the quality control engineer still believes that the software is insufficiently tested. In this situation the quality control engineer has the responsibility to seek other professional judgement on the subject. It is her obligation to prevent the software from being used until she is satisfied that it will perform.

3.3 Keith Miller:

Dept. of Computer Science
The College of William and Mary
Williamsburg, Virginia 23185
miller@cs.wm.edu

INFORMED CONSENT and the SOFTWARE ENGINEER

A theme in these cases is the implicit question "how good is good enough?" Was the interface good enough? ("Not if they shot down an airliner.") Is there sufficient time to do a good enough job? ("I'll let you know next week.") Can this test suite ensure that the code is good enough? ("That's what it says in the contract.")

These "how-good questions" are difficult because many people, values, and circumstances are involved. A list of players in a typical software engineering project includes a buyer, a seller, a developer, and a user. (On small projects, a single person may fill more than one role.) Each of these participants has goals, responsibilities, and options. In addition to these active players, there are "passive participants:" people who will be affected by the software even though they are not actively engaged in its development or use. In the cases at hand, these passive participants include taxpayers, airplane passengers, company stockholders, civilians under missile attack, and people under the flight path of an experimental fighter.

A software engineer can escape the complexities of the how-good questions in several ways. A common escape is the black-letter-law approach, which defines the problem as someone else's: "My specifications, designs, and implementations are all subject to customer review. I do what I am told. If I have fulfilled the contract, I have fulfilled my obligations." But computing professionals know that specifications, designs, and implementations are living things, and that no contract adequately covers all professional responsibilities. We call ourselves professionals because we expect customers to pay for our judgments. When we make judgements we are influenced by our values. Decisions based on human values include an ethical dimension which exists despite any denials.

A more ingenious escape involves the "state-of-the-art" concept. A software engineer who uses generally accepted software engineering practices may take comfort in the principle that a professional is negligent only when she falls short of industry standards. Although malpractice lawyers may endorse this lowest common denominator standard, most computing professionals will, on reflection, aspire to something higher.

If a software engineer decides to meet the how-good questions head on, the computing profession gives very little support. For example, the ethical codes of professional societies encourage responsible behavior, but they do not offer practical advice on how to determine what behaviors are most responsible. Academic departments typically treat such questions in an abstract or purely technical way. How-good questions demand an approach that includes both technical details and considerations of human values; software engineers need an approach that is practical and immediate.

Classical and applied ethics focus on competing human values. Some of us think that "computer ethics" (ethical enquiry specialized to computing) can help software engineers fashion responsible, rational answers to how-good questions. The issue of informed consent has been prominent in the applied ethics of medicine, and this issue may be appropriate to software engineers as well.

The parallels between M.D.s and software engineers are striking: in both cases, a professional gains status by specialized, uncommon knowledge; the uninitiated must trust the professional for adequate information to make intelligent choices; the professional must apply abstract principles in complex, ambiguous, and unpredictable environments; the research and development of both fields is expanding at a rapid rate; and both professions demand and receive great amounts of money, both individually and as a group. Applied to software engineers, the principle of informed consent might suggest a three-step process for answering a how-good question:

(1) Give sufficient information about cost and benefits to the affected parties ("attempt disclosure").

(2) Ensure that all parties comprehend the information and its significance ("verify comprehension").

(3) Devise a solution that all parties can agree to ("obtain consent").

For some high-risk, high-level decisions, it may be important to convene representatives of all affected parties and try to achieve a true consensus agreement. However, such a meeting will be impractical for many decisions. In the absence of a physical meeting, the software engineer can imagine such a meeting and attempt to develop a responsible decision by writing down the interests of all the parties and devising a solution that, at least in theory, should be acceptable to reasonable parties with these interests. The record of the resulting decision and its support can become part of the system documentation.
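As a rough sketch of what such a written record might look like when it travels with the system documentation, the following Python fragment captures the three steps (attempt disclosure, verify comprehension, obtain consent) in a small data structure. The structure, field names, parties, and example entries are hypothetical illustrations under the assumptions above, not a format proposed by the panel.

    # Hypothetical sketch of a decision record for a "how-good" question.
    # The fields mirror the three steps: disclosure, comprehension, consent.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Party:
        name: str                    # an affected party, active or passive
        interests: List[str]         # what this party stands to gain or lose

    @dataclass
    class DecisionRecord:
        question: str                # the "how-good" question being answered
        disclosure: str              # costs and benefits communicated ("attempt disclosure")
        comprehension_checked: bool  # was understanding verified? ("verify comprehension")
        parties: List[Party] = field(default_factory=list)
        resolution: str = ""         # the solution the parties could accept ("obtain consent")

    # Example: an invented record for the test-coverage question in case 2.3b.
    record = DecisionRecord(
        question="Is the contracted test suite sufficient for the flight control software?",
        disclosure="Untested flight regimes and the schedule cost of further testing were described.",
        comprehension_checked=True,
        parties=[
            Party("customer", ["on-time delivery", "safe flight control software"]),
            Party("test pilots and people under the flight path", ["freedom from undue risk"]),
            Party("developer", ["continued business", "professional reputation"]),
        ],
        resolution="Targeted additional tests agreed before sign-off.",
    )
    print(record.question)

The same information could equally well be kept as a dated memo; the point is only that the interests considered and the agreed resolution are written down and retrievable as part of the system documentation.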

We have applied this model of informed consent (based in part on the writing of Rawls) to several computer ethics cases. The panel discussion will include a demonstration of applying this technique to at least one of the cases listed for this panel.

3.4 Carl M. Skooglund:

Texas Instruments
P.O. Box 655474
MS 206
Dallas, Texas 75265

CASE 1

This appears to be both a quality issue and an ethics issue. In each case, I would contest the claim that the "interface contains all of the functionality described in the requirements."

In the radar example, being unable "to distinguish a military attack plane from a commercial airliner" does not meet the requirement that the information "can be used to determine if defensive action is necessary."

In the accounting system case, the customer was unhappy because the interface was "so hard to use." Clearly, the system did not meet end user needs despite the written requirements. And it certainly was far from a "more efficient accounting system."

Before the government (customer) decides to pull the investment, there seem to be some key questions they must ask. Who is the customer: the government agency that will use the new accounting system, or the U.S. taxpayers who will save a "considerable amount of money"? Is the problem in using the system a design problem or a training issue? Did the government properly define the operating parameters of the system? Did the supplier make a substantial effort to understand end user requirements?

It seems that the government and the supplier both have an obligation to the taxpayers to answer these questions so that some lessons can be learned to either correct the situation or at least prevent a recurrence.

CASE 2

If the software engineering department agrees to make modifications in a complex system, it should do so with the understanding that there will be sufficient time to perform a thorough checkout. Pressure to accelerate the work was obviously there, driven by financial benefits in one instance and by the potential for saving lives in the other. But a premature, poorly designed effort could clearly create unforeseen problems that might cost money and lives.

The ethical obligation of the software engineering department is to understand the requirements and the risks, to apply their best technical judgment and to candidly discuss the options with the customer. Agreeing to perform a task poorly simply because a customer requests it is not acceptable. Any organization must establish internal performance standards that supersede other demands whether the customer agrees to assume the risk or not.

CASE 3

Pressures of this type are all too frequent, and they often come down to technical judgment and gray-area decisions. Fortunately, I have never been confronted with a decision that traded off a quality compromise against going out of business, but we are asked to assume that in this case. I see the following issues to consider:

- Sending out a product that the designer is unsure of could potentially create far greater problems in the future. It could cost lives in the fighter example. Industry is replete with examples of companies that took expedient shortcuts on their way to bankruptcy.

- This case points out something very important. If an ethical working environment can be established that promotes candor, open discussion of problems, trust, and teamwork, then hopefully a situation will not arise that pits an individual against the group. Decisions on whether or not to sign off on an element of work should be made within a team environment. If it's clearly a case of an unethical group against a principled individual, then unfortunately, it sometimes comes down to a question of whether or not that individual should work for that organization. This, of course, is more easily said than done.

3.5 Gilbert J. Tumey:

Boeing Computer Services
P.O. Box 24346
Seattle, WA 98124-0346

As in most situations, the biggest challenge in resolving perceived ethical violations is to identify and agree on the issue to be resolved. The situations contained in the three case studies could represent or be the result of any number of issues. These issues could be identified as contractual, management, communications, or ethics related. They could also be any combination of the four.

By its very nature, ethics is not a clear matter of right or wrong, black or white, or good or bad. Ethics deals with gray areas, perceptions, and the intent of the individual or entity performing the act.

In responding to the situations contained in the three case studies, the first task is to identify the issue and reduce it to its least common denominator. Working from the premise that most people are basically honest and want to do the right thing, I believe you can trace the issues associated with these cases to contractual, management, or communications issues. The issue of ethics would only exist if you consider ill intent behind the individual's actions.

Most perceptions of unethical acts fade when they come under the spotlight and individuals communicate with one another and become aware their actions may be misunderstood. These situations tend to grow and fester as the length of time increases without open dialogue between the individuals involved. Therefore, it is important to foster an environment where open communication can take place and individuals can feel confident in discussing such issues with those directing their actions before the situation grows out of proportion.

Another way of demonstrating this point is to look at it from the perspective of decision making. The business decision is viewed from two different perspectives, one being the ethical perspective and the other being the economical perspective. Ethical decisions are classified as being right or wrong, referring to indicators like trust, fairness, and honesty. Economic decisions are classified as being good or bad, referring to such indicators as profit, revenue, or return on investment. In the process of making a business decision one should strive to make right-good decisions, right from an ethics standpoint and good from an economics standpoint. The following diagram illustrates this concept.

(adapted from Ethics and Leadership, William Hitt)

TWO DIMENSIONS OF DECISION-MAKING

                       Economically good     Economically bad
    Ethically right    Win-win               Business dilemma
    Ethically wrong    Ethical dilemma       Problem

The choice is rather obvious when the situation falls within the `Problem' or `Win-win' quadrants; however, the number of choices can increase when dealing with a `Business' or `Ethical' dilemma. I believe, however, we can further reduce the choices in most business situations through increased dialogue. Most ethical situations can be resolved or converted into business dilemmas through communication, defusing the situation by placing it on the table and talking about it with the parties involved.

Relating this to Case 1, I would conclude the issues to be business related unless the engineer developing the interface knew the interface was too difficult or intended to hide facts from the customer/user in order to receive acceptance. If facts were intentionally concealed, it would be a matter of ethics, the issue being honesty. As long as the engineer performing the work was responsive to the requirements, communicated and received approval for changes, and did not try to hide or camouflage the facts, I would not see it as a matter of ethics.

Relating to Case 2, I believe you would have an issue of ethics if the work was accepted knowing it could not be completed within the allotted time. This would be an issue of honesty when making the commitment. However, if the commitment was made in good faith and it was believed it could be completed within the allotted time, I would not see it as an ethical matter. If circumstances changed after the original commitment and something happened to change the time schedule, it would become another issue to be resolved, and the ethics would depend on how the new issue was handled. Assuming open, honest communications, subsequent issues discovered and addressed would be business related or judgment issues and not considered issues of ethics.

Case 3 would appear to be more straightforward. The software either does or does not do what it is intended to do. It passes the required tests or it does not pass. This assumes the results of the tests are honestly presented. There is nothing wrong with delivering a product that might have shortcomings as long as the customer is aware of what they are receiving and agrees to it. This involves talking with the customer throughout the development cycle to ensure there are no surprises when it comes to the final sign off. For the individual to sign off on the software knowing there are serious flaws would be a misrepresentation of the product being delivered and a matter of ethics. I believe open and honest communication is the preferred course of action in this case. The economics of the decision being made, or the `fate of the company,' does not change the ethics.

In summary, ethical issues involve personal values such as honesty, fairness, and trust. Issues of business judgment involve economic indicators such as revenue, profit, production quotas, etc. I believe communication is the key to avoiding most issues or perceptions of unethical conduct in the business environment. Hence, the best insurance against unethical activities is for the company, group, or organization to foster an environment where employees feel confident in communicating their thoughts and in asking questions if they feel something is not being pursued in an ethical manner.

