Notes for Deborah Johnson, Computer Ethics: Analyzing Information Technology, Fourth Edition

Key concepts: cocreation, computer ethics, conceptual muddle, information technology, metaquestion, policy vacuum, science and technology studies, sociotechnical computer ethics, sociotechnical systems perspective, value sensitive design.


Related theorists: Mark Bauerlein, Paul Edwards, David Golumbia, N. Katherine Hayles, Thomas Hughes, Jaron Lanier, Bruno Latour, Lawrence Lessig, Keith Miller, James Moor, Helen Nissenbaum, Jonathan Sterne, Lynn White, Langdon Winner.


PREFACE

Johnson saw her task as an early philosopher of computer ethics as distinguishing hype from serious analyses, using the strategy of identifying what remained the same versus what really changed in society, as well as taking into account the multidirectional relationship between technology and society.

(vi) My job—so it seemed—was to sort out the hype from the serious analyses. One of my strategies was to identify and emphasize that which remained the same—aspects of society that were unaffected or being reinforced and solidified. . . . Changes of many different kinds have occurred and these changes have been influenced by many factors, only one of which is the development and widespread use of computers, information technology, and the Internet. . . . Computers and information technology have shaped the world of today but social, political, economic, and cultural conditions have also shaped the development of computers, information technology, and the Internet. This edition of Computer Ethics attempts to take into account the complex, multidirectional relationship between technology and society.
(vi) We tried with this new edition to rethink the field of computer ethics so as to capture the powerful role that computers and information technology now play in so many aspects of everyday life. . . . In the end, we developed a structure that, we believe, serves as a framework for addressing an array of issues, some of which we have addressed extensively, others we have treated in a cursory fashion, and yet others we have not even mentioned.

Additional voice of computer scientist Keith Miller fills a gap Johnson recognized in her previous scholarship, balancing the desire to protect the integrity of computer science against the desire to provide details accessible to less technically sophisticated readers.

Adding the voice of a computer scientist to a less technical humanities presentation helps Johnson avoid the indulgence of rationalized ignorance that would open her approach to computer ethics to criticisms similar to those Bauerlein makes of adolescents missing their connection to tradition.

(vi) Perhaps most importantly, this edition includes the voice of Keith Miller. Keith and I met when he first began teaching a course on computer ethics over twenty years ago. As one of the first computer scientists to take on the responsibilities of such a course, Keith is a pioneer and veteran of the field. . . . At times it seemed that Keith was protecting the interests of computer science students and teachers, and I was concerned about accessibility to the less technically sophisticated readers.

“Information technology” replaces “computer” for the rest of the book following the first chapter.

(vii) Although the field continues to be called “computer ethics,” the attention of computer ethicists has expanded to include a much broader range of technologies more often now referred to as information technology. . . . Because Chapter 1 focuses on the field and its goals, methods, and mission, we stayed with the term “computer ethics” for that chapter. After Chapter 1 and throughout the rest of the book, we use the phrase “information technology” or the acronym “IT.”

New theoretical approach based on science and technology studies; still using provocative scenarios targeted at college-age students.

(vii) This edition includes a new theoretical approach. We have incorporated concepts and insights from the field of science and technology studies (STS). STS theories frame technology as sociotechnical systems and this, in turn, brings the connection between ethics and technology into sharper perspective.
(vii) As in earlier editions, all but one of the chapters begins with a set of scenarios designed to draw readers into the chapter topic. . . . In our selection of scenario topics we have been mindful of the experiences of college-age students.

Moves away from uniqueness claims and an address to computing professionals toward how computer ethics and the IT it encompasses fit within the cultural milieu of information societies, late capitalism, and digital order, and thus toward the new methodology of sociotechnical computer ethics, consonant with Latour, Sterne, and many other theorists relevant to texts and technology studies.

(vii) However, in this edition we have moved somewhat away from theorizing about the uniqueness of computer ethical issues and have, instead, framed the issues as part of a broader enterprise of understanding the connections between ethics and technology. . . . The overarching theme in Chapter 3 is that information societies are constituted with, and configured around, information technology, and this means that ethical issues have distinctive characteristics. The overarching theme of Chapter 6 is what we call “digital order.” The chapter focuses on several different issues that affect activities on the Internet. Order, we emphasize, is created by law, markets, social norms, and architecture. The chapter on professional ethics has been moved to the end of the book. Computer science students may well want to read this chapter early on, but it no longer serves as the motivation for subsequent chapters.

Interesting advertisement serving as a logotropos/hyperlink in book form to the equivalent of a dynamically determined destination, such as from instantaneous search results.

(vii) As the book goes to press, we have plans for a website to supplement the material presented in the book. The website will include additional scenarios, podcast discussions, links to other sites, and more. It should be available and easy to find by the time the book is published.


ACKNOWLEDGEMENTS

ABOUT THE AUTHORS


Chapter 1
Introduction to Sociotechnical Computer Ethics
SCENARIOS
Scenario 1.1 A Virtual Rape
Background

Each chapter begins with a set of scenarios with embedded case studies; the LambdaMOO virtual rape story, while dated, remains an exemplar.

(2) Today there are many more games of this kind with significantly enhanced capabilities. Nevertheless, LambdaMOO remains an intriguing exemplar of the complicated conceptual and ethical issues that arise around computers and information technology.

INTRODUCTION: WHY COMPUTER ETHICS?

Familiar call for studying ethical implications of IT choices to help steer development of future technologies.

(5) The scenarios suggest that living in a world constituted in part by computers may involve distinctive and especially challenging ethical issues.
(5) If we have any hope of steering the development of future technologies in a direction that is good for humanity, that hope lies in understanding the social and ethical implications of our choices about IT.

The “why computer ethics?” metaquestion involves clusters of issues surrounding the putative uniqueness of situations created by information technologies with respect to traditional ethical approaches; the authors propose a more general perspective connecting ethics and technology than the prior focus on the uniqueness of new computing technologies, which Johnson calls the standard account.

(5) At first glance, it seems that IT creates situations in which common or prevailing moral rules and principles don't seem to apply nor seem helpful in figuring out what one should do.
(6) They have asked whether the issues are so different that new moral theories are needed, or whether traditional theories might be extended to apply. As well, they have considered whether a new kind of methodology is needed for the field. We shall refer to this cluster of issues as the “why computer ethics?” question.
(7) The “why computer ethics?” question is what we might characterize as a metaquestion, a question about how we are asking our questions.
(7) The answer we will propose recommends that we keep an eye on the connection between ethics and technology in general as the backdrop—the framework—in which computer ethics issues can best be understood.

THE STANDARD ACCOUNT
New Possibilities, a Vacuum of Policies, Conceptual Muddles

Standard account introduced by James Moor that new possibilities created by computers raise ethical questions.

(7) Computer ethicists seem to accept the general parameters of an account that James Moor provided in a 1985 article entitled, “What is Computer Ethics?” We will refer to this account as the standard account. According to Moor, computers create new possibilities, new opportunities for human action.
(8) According to the standard account, these new possibilities give rise to ethical questions. . . . Some of these questions have been resolved (or, at least, concern has waned); some have been addressed by law; others continue to be controversial.

Gains and losses for different groups of individuals suggest the need for normative tests, with the focus adjusted to the ethical perspective; compare to Bijker and Hughes, Latour, and Boltanski and Chiapello.

(8) Of course, part of the answer is simply that the new possibilities are “new.” But part of the answer is also that new possibilities are not always or necessarily good (or purely good). They can affect different individuals differently. They can be disruptive and threatening to the status quo.
(8) To be sure, the implications of adoption and use of a particular technology can and should be examined from a variety of perspectives, including economics and politics, but the ethical perspective is especially important because it is normative.

For Moor, the task of computer ethics is filling policy vacuums by sorting out conceptual muddles, for example conceptualizing computer software so as to best fit prevailing intellectual property law.

(9) Moor (1985) describes the task of computer ethics as that of filling policy vacuums. According to Moor, when computers create new possibilities, there is a vacuum of policies. The new possibilities take us into uncharted territory, situations in which it is unclear what is at issue or which moral norms are relevant.
(9) Filling the policy vacuum involves sorting out what Moor refers to as conceptual muddles. To illustrate a conceptual muddle, consider another case from the early days of computing: computer software. When computer software was first created, the challenge was to figure out how best to conceptualize it. The problem had to do with fitting computer software to prevailing intellectual property law; copyright and patent seemed the best possibilities.
(9-10) The question of whether to grant copyright or patents for computer programs was, then, deeply linked to the conceptualization of computer programs. That is, the policy vacuum couldn't be filled without a conceptualization of software.

Summary of the standard account: computer ethics addresses conceptual muddles in order to fill policy vacuums resulting from new possibilities created by information technologies.

(10) In summary, then, according to the standard account of computer ethics: (1) ethical issues arise around IT because IT creates new possibilities for human action and there is a vacuum of policies with regard to the new possibilities, (2) the task of computer ethics is to evaluate the new possibilities and fill the policy vacuums, and (3) a significant component of this task is addressing conceptual muddles.

An Update to the Standard Account

Standard account not specific to IT but rather focuses on new technologies in general at their introduction stage.

(10) The point is that the standard account can be used to explain ethical issues around new technologies in general, and is not specific to IT ethics.
(11) The focus of attention is on one, and only one, stage in the lifecycle of technology, the stage in which it is first introduced.

Policy vacuums often filled by defaults that perpetuate existing tensions or bad policy decisions, all of which ethical analysis may reveal.

(11) For one thing, policy vacuums sometimes go unfilled or they get filled, but in ways that perpetuate struggle or tension over the policy. Sometimes policy vacuums are resolved with bad policies, policies with negative or undesirable consequences. In any of these cases, ethical analysis can have an important role in critiquing policies that have already formed, pointing to their misfit with notions of justice or responsibility or good consequences.

People already have well developed expectations and conceptual models about computer technologies; no longer new.

Response to emerging technology also conditioned by conceptual models, such as lifelong learning.

(11) The technology already has meaning and users already have well-developed expectations. In other words, people already have conceptual models of the technology and how it works; they have knowledge that informs how they approach and use new applications. . . . Hence, it no longer seems appropriate to frame computer ethics as a field focused exclusively on the newness or novelty of IT.

Focus on novelty bolsters the impression that technologies developed in isolation and were introduced to the market fully formed, though the social context is paramount and, as SCOT theorists insist, a long history of missteps and chance happenings often shapes it.

Contrast STS and SCOT model to decontextualized analysis epitomized by critique of writing in Phaedrus.

(11-12) When we focus on IT when it is new, we tend to think of the technology as arriving intact and being plopped into society where it is taken up and has an impact. This suggests that the technology came out of nowhere, or that it was developed in isolation from society and then “introduced.” . . . What a product or tool looks like—the features it includes, what it makes possible—has everything to do with the social context in which it was created and the context for which it was created.
(12) More often than not, successful technologies have gone through a long period of development with many missteps and unexpected turns along the way.

Surprising inaccuracy in basic personal computer history, putting the GUI ahead of the command line, while making a point about the privileged context of the invention of Apple in a garage.

(12) The garage had electricity, and Wozniak had been trained as an electronic engineer. The new computers they designed used the existing technologies of mouse and on-screen icons.

The opportunity for philosophers of computing to play a role in design is missed when the presumption of technological determinism shunts aside consideration of different possibilities, though Johnson notes Nissenbaum's TrackMeNot, based on the value sensitive design approach to IT ethics.

The missing link is to inspire the cultivation of technically savvy IT ethicists, for which critical programming serves to fill not a policy vacuum but a staffing vacuum.

(12) When the possibility of different sorts of technology is pushed out of sight, IT ethicists miss an important opportunity for ethics to play a role in the design of IT systems. [Note: . . . Helen Nissenbaum, for example, helped to design a program called TrackMeNot that helps protect users' privacy when they use Google to search the Web. She and others have developed an approach to IT ethics that is referred to as value sensitive design.]
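The mechanism behind TrackMeNot is obfuscation: the browser extension periodically issues plausible decoy searches so that logged queries no longer reveal the user's actual interests. A minimal sketch of that decoy-query idea, not the actual extension (which runs inside the browser and grows its query pool over time); the phrases, URL, and timing below are invented for illustration:

```python
import random
import time
import urllib.parse
import urllib.request

# Hypothetical seed phrases; the real extension evolves its pool dynamically.
DECOY_QUERIES = [
    "weather this weekend",
    "used car prices",
    "easy pasta recipes",
    "local election results",
    "how to fix a leaky faucet",
]

def send_decoy_query(engine_url: str = "https://www.google.com/search?q=") -> None:
    """Issue one randomized decoy search so that, in the server's logs,
    genuine queries are hidden among noise."""
    query = random.choice(DECOY_QUERIES)
    url = engine_url + urllib.parse.quote(query)
    request = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    urllib.request.urlopen(request, timeout=10)

if __name__ == "__main__":
    while True:
        send_decoy_query()
        # Randomized intervals avoid a machine-detectable rhythm.
        time.sleep(random.uniform(30, 300))
```

The design point is value sensitive design in miniature: privacy is protected not by withholding data but by degrading the inferences that can be drawn from it.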

Is it acceptable to recommend how computer ethics should be done without being deeply involved in practice, a question Johnson addresses by adding another author to the team?

Sociotechnical systems perspective intended to widen the scope of IT ethics to the complete lifecycle, away from the emphasis on newness and other shortcomings of the standard account.

Consider looking at end of life of technologies for retrospective study and learning.

(12) In putting the emphasis on newness, the standard account tends to push out of sight other stages in the lifecycle of technologies.
(13) The central focus of the rest of this book will be on the role of IT in constituting the social and moral world. For this purpose, it will be helpful to adopt what we will refer to as the sociotechnical systems perspective.

THE SOCIOTECHNICAL SYSTEMS PERSPECTIVE

Science and technology studies corrects three mistakes made when thinking about technology, rejecting determinism, material objects, and neutrality in favor of coshaping, sociotechnical systems, and value-infused technology.

(13) To provide a quick overview of the core ideas in STS, we can think of STS as identifying three mistakes that should be avoided in thinking about technology. Parallel to each of the three mistakes is a recommendation as to how we should think about technology and society.

Reject Technological Determinism/Think Coshaping

Two claims of technological determinism are that technology develops independently of society but then determines the character of society once adopted.

(13) technological determinism fundamentally consists of two claims: (1) technology develops independently from society, and (2) when a technology is taken up and used in a society, it determines the character of that society.

STS reveals social factors influencing development in addition to natural constraints: government agency decisions, social incidents, market forces, legal environment, cultural sensibilities.

(13) The character and direction of technological development are influenced by a wide range of social factors.
(14) Nature cannot be made to do just anything that humans want it to do. Nevertheless, nature does not entirely determine the technologies we get. Social factors steer engineers in certain directions and influence the design of technological devices and systems.

Lynn White's claim that the stirrup led to feudal society exemplifies the second sense of technological determinism, echoed by McLuhan; a recent version is that Internet adoption leads to democracy.

(14) Perhaps the most famous statement of this was historian Lynn White's claim (1962) that from the invention of the stirrup came feudal society. . . . Certain writers have suggested that when countries adopt the Internet, it is just a matter of time before democracy will reign. . . . This is an expression of technological determinism in the sense that it implies that a technology will determine the political structure of a country.

Social factors affect design, use, and meaning, making a position of cocreation more appropriate than determinism; compare to Hayles' intertwining of technogenesis and synaptogenesis.

(14) Social factors affect the design, use, and meaning of a technology, and in this respect society can push back and reconfigure a technology, making it into something its designers never intended.
(15) In effect, the STS counter to each tenet of technological determinism is the same; society influences technology. . . . The positive recommendation emerging out of this critique of technological determinism is that we acknowledge that technology and society cocreate (coshape; coconstitute) one another.

Reject Technology as Material Objects/Think Sociotechnical Systems

Artifacts only have meaning when embedded in social practices; thus technology is a social product involving a network of communities and activities, Hughes' sociotechnical systems; compare to the analysis of ensoniment by Sterne.

(15) To be sure, artifacts (human-made material objects) are components of technology, but artifacts have no meaning or significance or even usefulness unless they are embedded in social practices and social activities. . . . Producing a computer involves the organization of people and things into manufacturing plants, mining of materials, assembly lines, distribution systems, as well as the invention of computer languages, education and training of individuals with a variety of expertise, and more. In other words, technology is a social product.
(16) STS theorists recommend that we think of technology as sociotechnical systems (Hughes, 1994).

Same STS/SCOT idea that extends technology beyond artifacts gives matter to code.

(17) The material world powerfully shapes what people can and cannot do. However, we will be misled if we look only at artifacts. In fact, it could be argued that it is impossible to understand a technology by looking at the artifact alone. This would be like trying to understand the chess piece called “the rook” without knowing anything about the game of chess (the rules of the game, the goal, or other chess pieces).

Reject Technology as Neutral/Think Technology Infused with Values

To Winner, technology is never neutral, as adoption implies adopting, and then enforcing, a particular social order, such as the hierarchical decision-making system accompanying nuclear power; compare to Edwards' closed world, Golumbia's cultural logic of computation, Lessig's embedded laws, and Lanier's siren servers.

(17) Perhaps the most influential work on this topic is Langdon Winner's 1986 piece, “Do artifacts have politics?” Winner draws attention to the relationship between technology and systems of power and authority, arguing that particular technologies cannot exist or function without particular kinds of social arrangements. He argues that adoption of a particular technology means adoption of a particular social order. His example is that of nuclear power.
(17) In explaining this relationship between technologies and patterns of authority and decision making (which may seem quite deterministic), Winner provides a powerful example of how an artifact can enforce social biases and privilege individual agendas.

Johnson argues Winner goes too far and slips back into technological determinism in arguing against neutrality.

(18) Winner can be interpreted as slipping into the mistake of technological determinism. He seems to be suggesting that a technology—the bridges of Long Island—determined the social order.

SOCIOTECHNICAL COMPUTER ETHICS

The STS recommendations ground sociotechnical computer ethics, using the story of Facebook to exemplify each point: its situated development by Zuckerberg, the coconstitution of human and nonhuman components, and the embedded values of various stakeholders.

(18) The three STS recommendations provide the foundation of what we will call “sociotechnical computer ethics.”
(18) The story of Facebook's development goes right to the heart of the first STS theme in the sense that Facebook was not the “next logical development in the natural evolution of IT”; Facebook didn't come out of nowhere. It was created by Mark Zuckerberg while he was at Harvard and thought it would be fun to create something that would support social interactions among students.
(18-19) Perhaps the second STS lesson—not to think of technology as material objects—doesn't even need emphasizing to Facebook users because they think of the site not just as a material object or piece of software, but as a “social” networking site. . . . So, users were confronted with the fact that the system is not simply lines of code; it is partly lines of code, but the lines of code are written and maintained by programmers who take directions from administrators who respond to a variety of stakeholders, including users. Facebook is a sociotechnical system with many human and nonhuman components.
(19) As a social networking site, Facebook is far from neutral. . . . Although the system makes individuals quite transparent to their friends, the Beacon schema bumped up against many users' desire for some sort of privacy about shopping. . . . Changes in the architecture change the values embedded in the system.
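The claim that architecture embeds values can be made concrete with a hypothetical fragment (all names invented; this is not Facebook's code): whether sharing defaults to opt-in or opt-out is a single line, yet it encodes a stance on privacy, and Beacon's original default was, in effect, opt-out.

```python
from dataclasses import dataclass

@dataclass
class SharingPolicy:
    # This default *is* the embedded value judgment:
    # False = opt-in (privacy-protective), True = opt-out (disclosure-first).
    share_purchases_by_default: bool = False

def should_broadcast(user_prefs: dict, policy: SharingPolicy) -> bool:
    """Broadcast a purchase to friends only if the user's explicit choice,
    or failing that the architectural default, permits it."""
    explicit = user_prefs.get("share_purchases")  # None if the user was never asked
    return explicit if explicit is not None else policy.share_purchases_by_default
```

Changing that one default changes who bears the burden of action, which is what “changes in the architecture change the values embedded in the system” means in practice.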

The STS perspective gives a richer and more accurate understanding of situations in which moral questions arise or may be discovered as unknown knowns; Johnson returns to the scenario of whether to implant an RFID chip in an elderly parent to illustrate the unavoidability of taking more factors into account to make better decisions.

(19) The short answer is that the perspective gives us a fuller, more accurate, and richer understanding of situations in which moral questions and dilemmas arise. We can illustrate this by focusing on Scenario 1.3 in which an individual must make a decision about whether to have an RFID device implanted in her mother.
(19) The first step is to keep in mind that RFID is a sociotechnical system, not simply a material object. . . . The system is a combination of material chips together with social practices involving implantation of the tag, display of the data produced by the tag, interpretation of the data, and responses to the data.
(20) Yes, the sociotechnical systems perspective seems to generate more questions than someone without the perspective would have thought to ask. Although this may seem a burden, it is unavoidable that better decisions involve taking into account more factors. Yet the sociotechnical system perspective doesn't just expand the range of factors to be taken into account; it helps in identifying or articulating particular kinds of concerns, and reveals new opportunities for resolution or intervention.
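A hypothetical sketch can show why the chip alone settles nothing in the RFID scenario: a tag read yields only an opaque identifier, while identity, safe zones, and alerts all come from the surrounding system of registries, policies, and caregivers (every name below is invented for illustration).

```python
from datetime import datetime

def read_tag(antenna) -> str:
    """The artifact's entire contribution: an opaque ID string."""
    return antenna.scan()  # e.g. "E200-3412-0134-1200"

# Everything below is the social half of the sociotechnical system:
# who registered the tag, what counts as a safe zone, who gets told.
REGISTRY = {"E200-3412-0134-1200": "Kathy's mother"}
SAFE_READERS = {"front-door", "garden-gate"}

def interpret(tag_id: str, reader_name: str, notify) -> None:
    """Turn a raw read into a meaningful (and value-laden) event."""
    person = REGISTRY.get(tag_id, "unknown tag")
    if reader_name not in SAFE_READERS:
        # The "alarm" is a policy decision made by people, not a property of the chip.
        notify(f"{datetime.now():%H:%M} {person} passed a reader outside the safe zone")
```

Which readers count as safe, who receives the notification, and what they do with it are ethical choices located in the system, not in the artifact.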

MICRO- AND MACRO-LEVEL ANALYSIS

Sociotechnical systems perspective draws more attention to macro-level issues, but in the process enhances analysis of micro-level issues.

(21) Because the sociotechnical perspective frames technology as a system, it seems to draw more attention to macro-level issues. However, as we saw in our analysis of Kathy's situation, macro analysis enhances micro-level analysis. Thus, the sociotechnical systems perspective is compatible with, and useful to, both levels of analysis.

RETURN TO THE “WHY COMPUTER ETHICS?” QUESTION

Johnson believes better choices will derive from a better understanding of sociotechnical systems; computer ethics focuses on the role of IT in constituting the moral world.

(21-22) The better we understand technology and how it shapes and is shaped by morality, the better our choices and decisions are likely to be. . . . Different technologies affect human activity and forms of life differently. The field of computer ethics focuses specifically on the role of IT in constituting the moral world.

Conclusion

Study Questions






Johnson, Deborah G. Computer Ethics: Analyzing Information Technology. 4th ed. Upper Saddle River, NJ: Prentice Hall, 2009. Print.