Background notes:
- Shrader-Frechette's research analyzes mathematical, biological, and ethical problems in risk assessment, public health, and
environmental justice--especially those related to radiological, ecological, and energy-related risks
- this is a well-known summary of what is perhaps the most popular philosophical treatment of technology--an ethical evaluation
Because new technology expands the scope of ethical concepts, Shrader-Frechette suggests that those who study technology and ethics need both technical and philosophical skills. She states that philosophical questions about technology and ethics fall into at least five categories:
- conceptual and metaethical questions (e.g., definitions)
- general normative questions (e.g., human rights/duties)
- particular normative questions about specific technologies (e.g., liability)
- questions about the ethical consequences of technology (e.g., threats to civil liberties, weakened safety regulations)
- questions about the ethical justifiability of the various methods of technology assessment (e.g., how risks are analyzed and evaluated)
In the same manner as Ellul, Shrader-Frechette discusses probabilistic uncertainty--because we have limited experience with these new technologies, the probability of harm or even fatality is difficult to determine. And like Ellul, she suggests that we either rely on the subjective probabilities of experts or treat all outcomes, negative and positive, as equally likely. She warns of an "overconfidence bias"--the assumption that a given technology will produce no serious health consequences or accidents. In other words, doubt is good.
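To see why limited experience makes these probabilities so hard to pin down, here is a minimal sketch (my own illustration, not from Shrader-Frechette) using the statistical "rule of three": even a spotless safety record only bounds the risk, it does not eliminate it.

```python
# Illustration only: the "rule of three" says that after observing 0 accidents
# in n independent trials, an approximate 95% upper confidence bound on the
# true accident probability is about 3/n. A clean record does not mean zero risk.

def rule_of_three_upper_bound(n_trials: int) -> float:
    """Approximate 95% upper bound on event probability after 0 events in n trials."""
    return 3.0 / n_trials

for n in (100, 1_000, 10_000):
    bound = rule_of_three_upper_bound(n)
    print(f"0 accidents in {n:>6} trials -> true risk could still be up to {bound:.2%}")
```

This is exactly the overconfidence trap she warns about: a short accident-free track record gets read as proof of safety when it is compatible with substantial residual risk.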
Shrader-Frechette touches on the ethical debates over due process and risk assessment. As for due process, she states that ethicists believe technology threatens due process rights; due process is impossible in the case of fatalities, since the victim cannot be compensated for a death. As for risk assessment, Shrader-Frechette explains that philosophers and policy makers argue about how much risk is acceptable through the classic utilitarian vs. egalitarian debate. Philosophers are divided, in particular, on how to evaluate negligible risks. (For example, if a new drug or therapy helps a large population of, say, cancer patients, but a few suffer terrible, even fatal side effects, is the drug unethical? Do we let the individual decide, or should there be some ethical standard or guideline?)
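The drug example can be made concrete with a toy calculation. The figures below are hypothetical (not from the source) and only show how a utilitarian aggregation and an egalitarian constraint can reach opposite verdicts on the same facts.

```python
# Hypothetical numbers for illustration -- not data from Shrader-Frechette.
patients_helped = 100_000        # assumed number of patients who benefit
benefit_per_patient = 1.0        # assumed utility gained per patient helped
patients_fatally_harmed = 5      # assumed number with fatal side effects
harm_per_fatality = 1_000.0      # assumed utility lost per fatality

# Utilitarian view: aggregate benefits and harms across everyone affected.
net_utility = (patients_helped * benefit_per_patient
               - patients_fatally_harmed * harm_per_fatality)
utilitarian_approves = net_utility > 0

# Egalitarian / rights-based view: an uncompensable, nonconsensual fatal harm
# to even one person blocks approval, regardless of the aggregate benefit.
egalitarian_approves = patients_fatally_harmed == 0

print(f"Utilitarian: net utility = {net_utility:,.0f} -> approve? {utilitarian_approves}")
print(f"Egalitarian: fatal harms = {patients_fatally_harmed} -> approve? {egalitarian_approves}")
```

The disagreement is not about the numbers but about whether aggregate benefit can justify uncompensable harm to a few--which is precisely the divide she describes.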
This leads us to her last point--consent to risk. If an individual consents to a drug, a risky job, or a toxic living condition, have they truly consented? Shrader-Frechette worries that people may not be adequately informed about (or compensated for--e.g., the "compensating wage differential") the risks they take on. Obviously, this is where technical communication could get involved: in helping the public make informed decisions about their consent.