Interpretive flexibility relies on an imaginative attitude that Steven Winter terms “transperspectivity.” Designers must first “unravel or trace back the strands by which our constructions weave our world together” and then “imagine how the world might be constructed differently.” The capabilities approach complements the social construction of technology in that it asks how background social conditions can be changed to facilitate the realization of capabilities. Instead of forcing women to conform to inappropriate cockpit design, we ask how cockpit design can be reworked to facilitate the realization of the capability of women to fly planes.
Technological determinism
Technological determinism is the opposite of social construction. Where the position of social construction argues that society constructs or determines technology, the position of technological determinism argues that technology constructs or determines the dominant forms of social interaction. While Langdon Winner is not a technological determinist, he lays out a terminology that dramatizes how technologies can cease to function as tools and, instead, take on the role of centers of concentrated power that dictate social forms and relations. Technologies create their own imperatives; that is, they assert their requirements as needs that demand fulfillment if we are to continue their functioning. These technological imperatives create the need for reverse adaptations. Instead of our designing and modifying technologies to fit our needs (technologies serve us), we set aside our needs and adapt ourselves to serving the requirements of complex technologies (we serve technologies). Winner discusses the technological imperative and reverse adaptation in Autonomous Technology. Larry Hickman provides an excellent summary of Winner's approach in John Dewey's Pragmatic Technology.
Questions for assessing the appropriateness of a technology
- Does the technology in question play the role of a conversion factor that changes capabilities into active functionings? (Conversion factors are a bit like resources or means and can be personal, social, or environmental; see Robeyns.) Review the ten capabilities outlined by Nussbaum. Does the technology in question help to realize a capability in the STS of your case? Which one? How? On the other side, does the technology threaten to thwart the realization of a capability? Which one? How?
- Does the technology in question embrace simplicity and avoid (manifest or latent) complexity? The more complex a technology, the harder it is to control; as technologies become more complex, they take on lives of their own. So one way of approaching this question is to assess the complexity of the technology in terms of the background STS. Manifest complexity is complexity that is obvious on inspection. Latent complexity is hidden complexity that surfaces only as the technology interacts with its setting; it counts against the appropriateness of a technology because it can often lead to unpredictable breakdowns and accidents.
- Does the technology embody a decentralized approach to control, one that disperses control over many localized centers, or does it concentrate control in one powerful, centralized locale? Amish communities do not reject electricity per se but refrain from hooking up to power grids maintained by large public utilities in part because of this issue. As a general rule, a technology is more appropriate when it can be instantiated and managed through decentralized points of control rather than through large, bureaucratic, authoritarian centralized points of control and management. Windmills would be preferable on this criterion to nuclear reactors because the latter are subject to catastrophic failures and therefore require tight managerial controls best exercised through centralized and concentrated points of control and management.
- Does the technology realize or protect values (or resolve value conflicts) in such a way as to put the STS on a value-positive trajectory? This, more than any of the other criteria of technological choice, requires holistic thinking. Bringing a technology into an STS should require mutual adjustment. How will the STS have to be adjusted to incorporate the technology with the minimum number of value issues (value vulnerabilities or value conflicts)? Will these adjustments place the STS on a value-positive trajectory? On the other hand, how malleable is the technology? (This is something you have already begun to answer as you looked at the technology's complexity and centralization.) If malleable, it can be adapted to the surrounding STS. If not, then the problem of reverse adaptation arises.
- Does the technology provide for a just distribution of relevant costs and benefits? Technologies create benefits and costs. Utilitarianism argues that the only relevant factor is the ratio of benefits to costs; if benefits are maximized and costs minimized, utilitarianism enjoins that we adopt the technology. This criterion provides an important caveat: not only must benefits be maximized and costs minimized, but benefits and costs must also be broadly and equitably distributed among the stakeholders. Net benefit maximization often stands side by side with massive inequities in the distribution of costs and benefits; everybody benefits from cheaper gas prices made possible by the refinery located near a lower-class neighborhood, but those living next to the refinery bear the brunt of the costs if the gas is made cheap by sacrificing pollution controls.
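The distributional point in the last question can be made concrete with a small arithmetic sketch. The stakeholder groups, numbers, and helper functions below are hypothetical, invented only to illustrate how an option that maximizes net benefit can still distribute costs inequitably:

```python
from statistics import pstdev

def net_benefit(benefits, costs):
    """Total benefits minus total costs, summed over all stakeholders
    (the quantity utilitarian net-benefit maximization looks at)."""
    return sum(benefits) - sum(costs)

def distribution_spread(benefits, costs):
    """Spread (population std. dev.) of each stakeholder's net outcome.
    Lower spread means a more equitable distribution."""
    outcomes = [b - c for b, c in zip(benefits, costs)]
    return pstdev(outcomes)

# Hypothetical figures for four stakeholder groups; the refinery's
# neighbors are index 0.
# Option A: cheaper gas, weak pollution controls.
a_benefits = [12, 12, 12, 12]
a_costs    = [30,  1,  1,  1]   # neighbors bear the pollution costs

# Option B: slightly dearer gas, pollution controls funded.
b_benefits = [10, 10, 10, 10]
b_costs    = [ 7,  7,  7,  7]

print(net_benefit(a_benefits, a_costs))          # 15
print(net_benefit(b_benefits, b_costs))          # 12
print(distribution_spread(a_benefits, a_costs))  # large: neighbors are net losers
print(distribution_spread(b_benefits, b_costs))  # 0.0: everyone fares alike
```

On the utilitarian criterion alone, Option A wins (net benefit 15 vs. 12); on the distributional caveat above, Option B wins, since no stakeholder group is made a net loser. The criterion asks us to weigh both.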
Source:
OpenStax, Engineering ethics modules for ethics across the curriculum. OpenStax CNX. Oct 08, 2012 Download for free at http://legacy.cnx.org/content/col10552/1.3