Notes for Douglas Rushkoff Program or Be Programmed: Ten Commands for a Digital Age

Key concepts: biases of digital media, program or be programmed, ten commands (time, place, choice, complexity, scale, identity, social, fact, openness, purpose).

Related theorists: Walter Benjamin, David Berry, Andy Clark, Paul Edwards, Alexander Galloway, N. Katherine Hayles, John Kemeny, Friedrich Kittler, Lawrence Lessig, Lev Manovich, Neil Postman.

PREFACE

Assessing the situation of digital environments, thinking of Kittler, requires understanding programming, whether as a real programmer or as a critical thinker.

(8) Understanding programming—either as a real programmer or even, as I'm suggesting, as more of a critical thinker—is the only way to truly know what's going on in a digital environment, and to make willful choices about the roles we play.

Comparison of programming to driving versus being a passenger; think of Engelbart's bulldozer.

(9) The difference between a computer programmer and a user is much less like that between a mechanic and a driver than it is like the difference between a driver and a passenger.

Diminishing chances of having a choice in digital matters by relegating programming to others.

(9) You can relegate your programming to others, but then you have to trust them that their programs are really doing what you're asking, and in a way that is in your best interests. And the longer you live this way, the less access you have to the knowledge that it could be any other way, or that you ever had a choice in the matter.

Realization that the world is read/write, not just read-only.

(10) From that moment on, everything changed. I realized I was not living in a “read only” universe, but in a “read/write” one.

After realization about programming, sees programs designed for planned outcomes everywhere: economy, religion, politics.

(11) Our economy, our religions, our politics . . . everything was a program designed to do something. Some of those purposes may be long forgotten, and some of them may not be proceeding according to plan. But they were programs all the same.

Hacker ethos, capability of effecting real change, applied to various cultural phenomena and institutions.

(11) Learning about digital technology helped me to see the world in an entirely new way—as a programmer, or better, a hacker capable of effecting real change. And so I wrote books applying this hacker ethos to many different cultural phenomena and institutions.

Senses that a return to understanding programming puts humans back in control of steering civilization, an image fitting WALL-E better than driving off a cliff.

(11) I do want people to know something about programming, but more than that, I want them to consider putting their own hands back on the steering wheel of our civilization. It may just keep us from driving off a cliff.


Introduction

Civilization on an important threshold: program or be programmed.

(13-14) In the emerging, highly programmed landscape ahead, you will either create the software or you will be the software. It's really that simple: Program, or be programmed. Choose the former, and you gain access to the control panel of civilization. Choose the latter, and it could be the last real choice you get to make.

Imagines, as an alternative trajectory to driving off a cliff, transformations of shared, networked, extended consciousness and cognition, while fitting into the orality and literacy periodization.

(14) Just as words gave people the ability to pass on knowledge for what we now call civilization, networked activity could soon offer us access to shared thinking—an extension of consciousness still inconceivable to most of us today. The operating principles of commerce and culture—from supply and demand to command and control—could conceivably give way to an entirely more engaged, connected, and collaborative mode of participation.

Social hopes for Internet seem to be failing, draining values and denying deep thinking rather than fostering highly articulated connections and new forms of creativity.

(16) A society that looked at the Internet as a path toward highly articulated connections and new methods of creating meaning is instead finding itself disconnected, denied deep thinking, and drained of enduring values.

The extended mind takes the form of a cybernetic mob rather than a new collective human brain; humans are reduced to externally configurable nervous systems while computers internetwork in more advanced ways.

(17) It is that thinking itself is no longer—at least no longer exclusively—a personal activity. It's something happening in a new, networked fashion. But the cybernetic organism, so far, is more like a cybernetic mob than new collective human brain. People are being reduced to externally configurable nervous systems, while computers are free to network and think in more advanced ways than we ever will.

Teaching kids to write with software seems a sufficient response to formerly unidirectional, producer-biased mass media, but they should be writing software; programming is the underlying capability of the era, as Heidegger noted.

(19) Computers and networks finally offer us the ability to write. And we do write with them on our websites, blogs, and social networks. But the underlying capability of the computer era is actually programming—which almost none of us knows how to do. We simply use the programs that have been made for us, and enter our text in the appropriate box on the screen. We teach kids how to use software to write, but not how to write software. This means they have access to the capabilities given to them by others, but not the power to determine the value-creating capabilities of these technologies for themselves.

The masses are one full dimensional leap behind those in power, potentially relinquishing collective agency to machines rather than to elite human groups, for they are also not the ones who design what those in power manipulate effortlessly to their advantage; they are more like addicts, lucky to know how to operate the machines at all. This is the danger foreshadowed, in different ways, by the movies Fail Safe and WALL-E.

(20) As a result, most of society remains one full dimensional leap of awareness and capability behind the few who manage to monopolize access to the real power of any media age.
(20) Before, failing meant surrendering our agency to a new elite. In a digital age, failure could mean relinquishing our nascent collective agency to the machines themselves.

That cognition is replicated in extra-human mechanisms makes the digital age different from the literate one.

(21) We are not just extending human agency through a new linguistic or communications system. We are replicating the very function of cognition through external, extra-human mechanisms.

Assumption that even basic knowledge of how devices are programmed, or some input into decision and design processes, provides a solution to the imminent relegation of agency; compare getting on top by cultivating basic knowledge to Berry's riding on top of streams.

Using computers and networks differs from using calculators because we barely know what we are asking them to do, and are hardly teaching them how to do it (Kemeny).

(23) The way to get on top of all this, of course, would be to have some inkling of how these “thinking” devices and systems are programmed—or even to have some input into the way it is being done, and for what reasons.
(23) With computers and networks, unlike our calculators, we don't even know what we are asking our machines to do, much less how they are going to go about doing it. Every Google search is—at least for most of us—a Hail Mary pass into the datasphere, requesting something from an opaque black box.

Need for the sustained thought of the literary humanist subject, thinking alone or in small, self-selected groups.

(24) The sustained thought required now is the sort of real reflection that happens inside a human brain thinking alone or relating to others in small self-selecting groups, however elitist that may sound to the techno-mob.

Need for human response to technologies, a new ethical template, akin to codification by Torah and Talmud of changes brought on by literacy.

(25) We are aware of the many problems engendered by the digital era. What is called for now is a human response to the evolution of these technologies all around us. We are living in a different world than the one we grew up in—one even more profoundly different than the world of the alphabet was from the oral society that existed for millennia before it. That changing society codified what was happening to it through the Torah and eventually the Talmud, preparing people to live in a textual age. Like they did, we need to codify the changes we are undergoing, and develop a new ethical, behavioral, and business template through which to guide us. Only this time it must actually work.

Offers ten commands to balance the recognized biases of digital media; others would add the nuances of embedded cultural aspects of the present regime of historically contingent, capitalist digital media (Hayles, Manovich, Edwards, Malabou).

(26) Each command is based on one of the tendencies or “biases” of digital media, and suggests how to balance that bias with the needs of real people living and working in both physical and virtual spaces—sometimes at the very same time.
(26) A bias is simply a leaning—a tendency to promote one set of behaviors over another. All media and all technologies have biases.

Understanding biases is the guiding philosophy for getting on top of the problem posed by rapidly transforming technologies that seem to have taken command on their own (Kittler).

(27) We can't quite feel the biases shifting as we move from technology to technology, or task to task. Writing an email is not the same as writing a letter, and sending a message through a social networking service is not the same as writing an email. Each of the acts not only yields different results, but demands different mindsets and approaches from us.
(27) Only by understanding the biases of the media through which we engage with the world can we differentiate between what we intend, and what the machines we're using intend for us—whether they or their programmers even know it.


I. TIME
Do Not Be Always On

Digital technologies are biased toward asynchronicity, away from the progression of time familiar to consciousness and toward decision-by-decision operation.

(30) Digital technologies are biased away from time, and toward asynchronicity.
(30-31) Instead of operating in time, computers operate from decision to decision, choice to choice.

Programs encourage human behaviors biased toward decisions.

An example of the space of flows in machine embodiment: by default the machine runs in a loop of continuous time, even though Rushkoff claims machine cognition ignores continuous time in its succession of command-prompt choice responses; this shows we really do not know what our machines think. Occurring outside time should be supplemented by also occurring in precise time; see the sketch after the quote below.

(31) Because computer code is biased away from continuous time, so too are the programs built on it, and the human behaviors those programs encourage. Everything that we do in the digital realm both benefits and suffers from its occurrence outside time.
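
A minimal sketch (my illustration, not Rushkoff's) of the point: a program that seems to wait outside time for the user's next choice is, underneath, a process embedded in precise clock time, suspended and resumed by the operating system at exact instants.

    import time

    def command_prompt():
        # The interface appears timeless: nothing "happens" until a choice is made.
        while True:
            choice = input("command> ")   # the process is suspended here, "outside time"...
            now = time.time()             # ...yet the OS resumes it at a precise clock instant
            if choice == "quit":
                break
            print(f"executed {choice!r} at {now:.6f}")

    # command_prompt()  # uncomment to run interactively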

Consider the evolution of time-sharing and networking that yielded this outcome, especially now that their operation has become less noticeable; the batch pattern is sketched after the quote below.

(31-32) The underlying asynchronous quality of email and conferencing was much more obvious to us back then, because we all saw the way these tools really worked. Back then, phone calls still cost money, as did our access time. So our computers generally went online, logged into a server, downloaded everything we were supposed to see, and then logged off again.
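
A sketch of that store-and-forward pattern using Python's standard poplib; the host and credentials are placeholders. The asynchrony is explicit: connect briefly, download everything, disconnect, then read and compose offline.

    import poplib

    HOST, USER, PASSWORD = "mail.example.com", "user", "secret"  # placeholder account

    def batch_session():
        mailbox = poplib.POP3(HOST)     # go online and log into the server
        mailbox.user(USER)
        mailbox.pass_(PASSWORD)
        count, _size = mailbox.stat()   # how many messages are waiting
        messages = [mailbox.retr(i + 1) for i in range(count)]  # download them all
        mailbox.quit()                  # log off; reading and replying happen offline
        return messages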

The remote control exemplifies the interactive device for breaking time, and for exercising power over biased media programming epitomized by advertising; however, escaping ads by changing channels to other media streams, a deconstruction of story, points to the absence of a coherent substitute and leads to surfing mode.

(32) The interactive urge itself—even before computers came into our lives—was consistent with this desire to break time. The first interactive device most of us ever used was the remote control.
(33) But after the remote control, escape from the advertiser's spell becomes effortless. With a micro-motion of the thumb, we are gone. The interactive device introduces discontinuity into an otherwise continuous medium. And this discontinuity—this deconstruction of story—is a form of power.

Sense of the mashup as a suboptimal substitute, and of the consequent always-on condition.

(34) The spirit of the digital age still finds its expression in this reappropriation of time. Our cutting and pasting, mash-ups and remixes, satires and send-ups all originate in this ability to pause, reflect, and rework.
(34) Our devices and, by extension, our nervous systems are now attached to the entire online universe, all the time.

The fault is the way we use technology, believing ourselves proficient multitaskers like our operating systems.

(35) No matter how proficient we think we are at multitasking, studies show our ability to accomplish tasks accurately and completely only diminishes the more we try to do at the same time. This is not the fault of digital technology, but the way we use it.
(35) And so we sacrifice the thoughtfulness and deliberateness our digital media once offered for the false goal of immediacy—as if we really can exist in a state of perpetual standby.

Phantom vibration syndrome as an everyday symptom of trying to exist in a state of perpetual standby, like our machines.

(36) For the first time, regular people are beginning to show the signs of stress and mental fatigue once exclusive to air traffic controllers and 911 operators. Cell phone users now complain of “phantom vibration syndrome,” the sensation of a cell phone vibrating on your thigh, even though there's no phone in your pocket.

The ethical suggestion is to refuse to be always on; choose when to be available.

(37) Of course, the simplest way out is to refuse to be always on. To engage with the digital—to connect to the network—can still be a choice rather than a given. That's the very definition of autonomy. We can choose to whom or what we want to be available, and when.

Offloading processes, not just information (Clark), preparing for evolutionary transformation of perpetual network connectivity.

(39) We have been consistently using our brains less as hard drives and more as processors—putting our mental resources into active RAM. What's different now, however, is that it's not just lists, dates, and recipes that are being stored for us, but entire processes.
(40) Are we choosing to surrender the ability to do it without digital assistance? If so, are we prepared to remain connected to our networks all the time? What new ability, if any, are we making room for in the process?


II. PLACE
Live in Person

Example of socialite living through network and ignoring those present.

(43) She relates to her friends through the network, while practically ignoring whomever she is with at the moment. She relates to the places and people she is actually with only insofar as they are suitable for transmission to others in remote locations. The most social girl in her class doesn't really socialize in the real world at all.

Digital media are biased away from the local and toward dislocation, a locationlessness that results from design.

(43) While the intent of digital networks was not to disconnect a high school girl from her real world friendships, the bias of the networks were absolutely intended to favor decentralized activity.
(43) As a result, digital media are biased away from the local, and toward dislocation.

Promotion of long-distance interests; competition with local interests through mass and now directed media.

(44) As the promoters of distance over the local, media have also promoted the agendas of long-distance interests over those of people in localities.
(46) Mass media became the non-local brand's way of competing against the people with whom we actually worked and lived.
(47) the non-local bias of the net is accepted as a means to an end: We go online in order to communicate with people who are not with us at that moment, and hopefully to arrange a time and place to meet for real.
(47) Interactive technology has also allowed for conversations to take place in a media landscape that formerly promoted only one-way broadcast.

Loss of periphery effects of embodied interaction, in particular giving and accepting kindness.

(48) In fact, the ease with which a simulated, digital connection is made can often make us more distant. . . . It takes less effort but it's also less beneficial for everybody concerned. Giving and accepting kindness used to be part of [(sic)] package.

Fetishization of tools, as in technopoly (Postman).

(48) As we come to depend on the net for our sense of connection to each other and the world, we end up fetishizing the tools through which all this happens.
(49) And so we begin to use long-distance technologies by default, even in local situations where face-to-face contact would be easier.

Rushkoff insists on valuing the full-spectrum personal encounter by forgoing mediating electronic presentations.

(50) No, the reason to spend the jet fuel to bring a human body across a country or an ocean is for the full-spectrum communication that occurs between human beings in real spaces with one another. The digital slideshow, in most cases, is a distraction—distancing people from one another by mediating their interaction with electronic data.


III. CHOICE
You May Always Choose None of the Above

Analog recording is based on physical inscription, while digital recording is a series of choices via engineered transduction into discrete differences; see the sketch after the quotes below.

(53) The analog recording is a physical impression, while the digital recording is a series of choices.
(54) In the digital recording, however, only the dimensions of the sound that can be measured and represented in numbers are taken into account.
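
A toy sketch of digitization as a series of choices, under assumed parameters: measure the signal only at discrete instants, and snap each measured amplitude to one of a few levels; everything between the choices is simply discarded.

    import math

    SAMPLE_RATE = 8   # samples per second: a choice
    LEVELS = 16       # 4-bit amplitude resolution: another choice

    def digitize(duration=1.0, freq=1.0):
        samples = []
        for n in range(int(duration * SAMPLE_RATE)):
            t = n / SAMPLE_RATE                       # measure only at discrete instants
            amplitude = math.sin(2 * math.pi * freq * t)
            quantized = round((amplitude + 1) / 2 * (LEVELS - 1))  # snap to 16 levels
            samples.append(quantized)
        return samples

    print(digitize())  # the "recording": a short list of integers, not a waveform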

Digital recording is a fundamentally different phenomenon from analog, like the virtual communication of the last chapter and formant synthesis.

(54-55) But the problem is not that the digital recording is not good enough—it is that it's a fundamentally different phenomenon from the analog one.

The digital is biased toward a milieu of constant choice-making, as Kitchin and Dodge describe with code/space.

(55) The digital realm is biased toward choice, because everything must be expressed in the terms of a discrete, yes-or-no, symbolic language. This, in turn, often forces choices on humans operating within the digital sphere.
(56) The issue here is that even if our world is made of pure information, we don't yet know enough about that data to record it.

Humans accommodate computers constantly making discrete choices by living and defining ourselves in their terms; consider compromises in history of cybernetics (Hayles).

(57) And while our computers are busy making discrete choices about the rather indiscrete and subtle world in which we live, many of us are busy, too—accommodating our computers by living and defining ourselves in their terms.

Unfreedom of continuous, forced choice; it dovetails with consumer identity.

(58) Still, there's a value set attending all this choice, and the one choice we're not getting to make is whether or not to deal with all this choice.
(58-59) Digital technology's bias toward forced choices dovetails all too neatly with our roles as consumer, reinforcing this notion of choice as somehow liberating while turning our interactive lives into fodder for consumer research.

Choice as invitation to sell: compare to Janz on formulating research questions using web search.

(59) choice is less about giving people what they want than getting them to take what the choice-giver has to sell.

Entrained striations, choice filter creation, programmed visions: trajectory of WALL-E humans.

(59) Meanwhile, the more we learn to conform to the available choices, the more predictable and machinelike we become ourselves. We train ourselves to stay between the lines, like an image dragged onto a “snap-to” grid.
(59) Likewise, through our series of choices about the news we read, feeds to which we subscribe, and websites we visit, we create a choice filter around ourselves.

Tagging as a conscious-intervention response to forced choices, one that databases and other programs will eventually evolve to accommodate, if not become connectionist like collective cognition; see the sketch after the quote below.

(60) One emerging alternative to forced, top-down choice in the digital realm is “tagging.” . . . While traditional databases are not biased toward categorizing things in an open-ended, bottom-up fashion, they are capable of operating this way. . . . It's all in the programming, and in our awareness of how these technologies will be biased if we do not intervene consciously in their implementation.
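
A minimal sketch of the contrast between top-down categories and bottom-up tagging: the first forces exactly one choice from a fixed menu, while the second lets open-ended, user-supplied labels accumulate, with groupings derived afterward.

    from collections import defaultdict

    CATEGORIES = {"article", "book", "video"}   # top-down: a fixed menu
    category = "book"
    assert category in CATEGORIES               # the forced choice: one, from the list

    tags = defaultdict(set)    # bottom-up: item -> any number of free-form tags
    index = defaultdict(set)   # derived folksonomy view: tag -> items

    def tag(item, *labels):
        for label in labels:
            tags[item].add(label)
            index[label].add(item)

    tag("rushkoff2010", "media-ecology", "programming", "ethics")
    tag("kittler1999", "media-ecology", "discourse-networks")
    print(sorted(index["media-ecology"]))  # groupings emerge without a fixed schema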


IV. COMPLEXITY
You Are Never Completely Right

Bias toward reduction of complexity with expectation that network will respond.

(62) Thanks to its first three biases, digital technology encourages us to make decisions, make them in a hurry, and make them about things we've never seen for ourselves up close.
(62) This makes digital technology—and those of us using it—biased toward a reduction of complexity.
(62) The pursuit itself is minimized—turned into a one-dimensional call to our networks for a response.

Difference between information delivered by the network and knowledge reached through genuine inquiry reminiscent of Phaedrus and Kemeny.

(63) We only get into trouble if we equate such cherry-picked knowledge with the kind one gets pursuing a genuine inquiry.
(64) It re-creates the process of discovery, putting the researcher through the very motion of cognition rather than simply delivering the bounty.
(65) Both sides in a debate can cherry-pick the facts that suit them—enraging their constituencies and polarizing everybody.

Treat data as untested models whose relevancy is conditional and personal, while looking to the development of the channel surfer's skill of quickly getting the gist of entire areas of study, recalling Lawnmower Man.

(66-67) To exploit the power of these new arrangements of data, we must learn to regard them as what they are: untested models, whose relevancy is at best conditional or even personal.
(67) Young people, in particular, are developing the ability to get the gist of an entire area of study with just a moment of interaction with it. With a channel surfer's skill, they are able to experience a book, movie, or even a scientific process almost intuitively.

Reading as a process of elimination, knowing how not to know what does not have to be known: a curious reversal of uncovering unknown knowns that may help push us toward becoming dumber.

(67-68) Reading becomes a process of elimination rather than deep engagement. Life becomes about knowing how not to know what one doesn't have to know.

Striating reduction of complexity in experience of world through technological upgrades.

(68) The more complex our technologies become, and the more impenetrable their decision-making (especially to our increasingly simplified, gist-of-it brains), the more dependent on them we become. Their very complexity becomes the new anxiety, replacing the adman's storytelling as the predominant form of social influence.
(68) With each upgrade in technology, our experience of the world is further reduced in complexity.

Experience of virtual worlds adjusts our senses and decreases perceptual abilities along the striations of optimized algorithms like MP3, maps mistaken for the journey; see the sketch after these quotes.

(70) our inability to distinguish between a virtual reality simulation and the real world will have less to do with the increasing fidelity of simulation than the decreasing perceptual abilities of us humans.
(70) The MP3 algorithm has a way of creating the sensation of bass, the sensation of high notes, and so on. Listening to these files in lieu of music, however, seems to strain or even retrain our hearing.
(70) Digital reduction yields maps. These maps are great for charting a course, but they are not capable of providing the journey.
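
A toy illustration of perceptual-style reduction (not the actual MP3 psychoacoustic model): move the signal into the frequency domain, keep only the strongest components, and resynthesize; what comes back is a map of the sound rather than the sound itself.

    import numpy as np

    def lossy(signal, keep=0.1):
        spectrum = np.fft.rfft(signal)                    # frequency-domain view
        cutoff = np.quantile(np.abs(spectrum), 1 - keep)  # threshold for "weak" components
        spectrum[np.abs(spectrum) < cutoff] = 0           # discard what we supposedly won't miss
        return np.fft.irfft(spectrum, n=len(signal))      # resynthesize the "sensation"

    t = np.linspace(0, 1, 1024, endpoint=False)
    original = np.sin(2 * np.pi * 50 * t) + 0.2 * np.random.randn(1024)
    approximation = lossy(original)
    print(f"energy retained: {np.sum(approximation**2) / np.sum(original**2):.2%}")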


V. SCALE
One Size Does Not Fit All

The ability to scale and move up levels of abstraction is a key business survival skill in the digital realm.

(74) On the net, everything is occurring on the same abstracted and universal level. Survival in a purely digital realm—particularly in business—means being able to scale, and winning means being able to move up one level of abstraction beyond everyone else.

Scaling and derivative strategies reflect common layered models in computer hardware, software systems, networks: diachronies in synchrony.

(75) Because the net is occurring on a single, oversimplified and generic level, success has less to do with finding a niche than establishing a “vertical” or a “horizontal.” . . . In either case, “scaling up” means cutting through the entire cloud in one direction or another: becoming all things to some people, or some things to all people.
(75-76) In an abstracted universe where everything is floating up in the same cloud, it is the indexer who provides context and direction.
(76) Of course, this logic dovetails perfectly with a financial industry in which derivatives on transactions matter more than the transactions themselves.

Centralization, standards, and hierarchies are at the heart of networks and digital media; compare to the analyses of Galloway and Lessig. The standardization point is demonstrated in the sketch after the quote below.

(77) What all this abstraction does accomplish here on earth, however, is make everyone and everything more dependent on highly centralized standards. Instead of granting power to small businesses on the periphery, the net ends up granting even more authority to the central authorities, indexers, aggregators, and currencies through which all activity must pass. Without the search engine, we are lost. Without centrally directed domain name servers, the search engines are lost. Further, since digital content itself needs to be coded and decoded, it requires tremendous standardization from the outset. Far from liberating people and their ideas from hierarchies, the digital realm enforces central control on an entirely new level.
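
A small demonstration of the coding-and-decoding claim in the quote above: digital content is meaningful only under a shared coding standard, and the same bytes decoded under a mismatched standard turn to mojibake.

    text = "Rushkoff café"          # any non-ASCII character will do
    data = text.encode("utf-8")     # encoding requires agreeing on a standard

    print(data.decode("utf-8"))     # same standard on both ends: "Rushkoff café"
    print(data.decode("latin-1"))   # mismatched standard: "Rushkoff cafÃ©"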

Reliance on familiar brands, trusted authorities, generic symbols to gain bearings due to abstraction and lack of local interaction.

(77-78) On a more subtle level, the abstraction intrinsic to the digital universe makes us rely more heavily on familiar brands and trusted authorities to gain our bearings. . . . Learning, orienting, and belonging online depend on universally accepted symbols or generically accessible institutions.

All media are biased toward abstraction, representing other media.

(78) In fact, all media are biased toward abstraction in one way or another.

Hypermedia's disconnection from author and context forms a nexus of abstracted connections, a world of symbols about symbols; think in terms of Benjamin's aura and the postmodern simulacrum.

(80) Finally, the digital age brings us hypertext—the ability for any piece of writing to be disconnected not just from its author but from its original context.
(80) But from a practical and experiential perspective, we are not talking about the real world being so very connected and self-referential, but a world of symbols about symbols. Our mediating technologies do connect us, but on increasingly abstracted levels.
(81) The original painting, hanging in the very cathedral for which it was painted perhaps, has what Benjamin called an “aura,” which is at least partly dependent on its context and location.

Notes a symptom in our behavior of collecting mass-produced consumer widgets, now including old personal computers that can be archives of meaning and speech synthesis, recalling reading Diogenes Laertius in Greek; let this be an entry to working code via critical programming.

We fetishize premodernity based on misunderstood configurations and uses of precommodities; yet within the class of assemblies that respond, those involving real and emulated early computing machinery emit a particular aura whose significance we cannot quite discern but feel is there, drawing Bogost and Montfort to the Atari VCS and to languages like Commodore BASIC.

(81) Strangely enough, as we migrate from his world of mass-produced objects to the realm of even more highly abstracted digital facsimiles, we nostalgically collect the artifacts of midcentury mass production as if they were works of art.

Digital simulations are simulacra, abstracted representations of games and math; the postmodern fear of entrancing simulation resulting in disconnection from people and places culminates in Turkle's robotic moment.

(83-84) While games and math might be abstracted representations of our world, our digital simulations are abstracted representations of those games and mathematics. . . . As the postmodernists would remind us, we have stuff, we have signs for stuff, and we have symbols of signs. What these philosophers feared was that as we came to live in a world defined more by symbols, we would lose touch altogether with the real stuff; we would become entranced by our simulated reality, and disconnect from the people and places we should care about.

The saving power is that tools for manipulating symbolic worlds remain accessible; the enthymeme involves taking to heart the ten commands Rushkoff develops.

(84) What the postmodernists may have underestimated, however, was the degree to which the tools through which these symbolic worlds are created—and ways in which they might be applied—would remain accessible to all of us.


VI. IDENTITY
Be Yourself

Crowd behavior is engendered by anonymous online status and the experience of acting from a distance and in secrecy, exacerbating the dehumanizing tendencies of digital technology until people become an angry mob; contrast the dehumanizing tendencies magnified by punch-card machinery, which produced an automatic machine.

(88) But more than simply protecting them from retribution, the anonymous status of people in an online group engenders crowd behavior. They have nothing to fear as individuals, and get used to taking actions from a distance and from secrecy. As a result, they exacerbate digital technology's most dehumanizing tendencies, and end up behaving angrily, destructively, and automatically. They go from being people to being a mob.
(89) The less we take responsibility for what we say and do online, the more likely we are to behave in ways that reflect our worst natures—or even the worst natures of others. Because digital technology is biased toward depersonalization, we must make an effort not to operate anonymously, unless absolutely necessary. We must be ourselves.

Duration is crucial for building out-of-body reputation; see the sketch after the quote below.

(91) Like an eBay “seller rating,” the more time it has taken to acquire a reputation in an online environment, the more it matters—even when it is entirely out of body.
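
A hypothetical sketch of duration-weighted reputation in the spirit of the eBay example; the weighting constant is arbitrary. The same average rating counts for more the longer it took to acquire.

    def reputation(avg_rating, months_active, half_weight_at=12):
        # Weight grows with tenure: a 4.8 earned over years outranks one earned last week.
        weight = months_active / (months_active + half_weight_at)
        return avg_rating * weight

    print(reputation(4.8, 1))    # new seller: heavily discounted
    print(reputation(4.8, 60))   # five-year seller: nearly full credit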

Online experience resembles the autistic person with Asperger's syndrome more than the unprejudiced intellectual, since it depends on the small percentage of human communication that occurs on the verbal level.

(91-92) Our experience is less that of the unprejudiced intellectual than that of the autistic living with Asperger's syndrome. . . . a dependence on the verbal over the visual, low pickup on social cues and facial expressions, apparent lack of empathy, and the inability to make eye contact.
(92) But online, we are depending entirely on that tiny 7 percent of what we use in the real world. Absent the cues on which we usually depend to feel safe, establish rapport, or show agreement, we are left to wonder what the person on the other end really means or really thinks of us.

Compensatory exhibitionism combined with permanency robs youth of free experimentation, pushing towards more anonymity; resentment seeps into communications.

(93) As if desensitized by all this disembodiment, young people also exhibit an almost compensatory exhibitionism.
(94) Sadly for young people, the digital realm is permanent. This robs from them the free experimentation that defines one's adolescence.
(94) And this permanence, once fully realized and experienced, only pushes the more cynical user to increasing layers of anonymity.
(95) Once we surrender to the status of the anonymous, our resentment at having to do so will seep into our posts.

Ethic of developing comportment to permanence of net life by maintaining strict sense of identity.

(95) On the other hand, maintaining a strict sense of identity online is liberating, even empowering. We realize that nothing we commit to the net is truly off the record, and learn not to say anything we aren't proud to see quoted, shared, and linked to.


VII. SOCIAL
Do Not Sell Your Friends

Social use overwhelmed early computing networks, which were finally opened for commercial use.

(97) While the Internet—then Arpanet—was a technological success, it had become overwhelmed by social use. . . . The government ended up setting the net free, to a large extent, with the proviso that it only be used for research purposes.
(98) Finally, after a series of violations by small businesses looking to promote their services online, the net was opened for commercial use.

Contact, not content, is king; social media was born from the infrastructure laid during the dot-com boom.

(99) Left to our own devices, however, net users began to blog. And link. And comment. The manic investment of the dot-com boom had given us a robust network and fast connections, with which we could now do as we pleased. The web still had businesses on it, but the vast majority of connections and conversations were between people. It turned out, content is not king—contact is. And so what we now call “social media” was born.
(99) Smarter businesses took notice. AOL, GeoCities, Friendster, Orkut, MySpace, and Facebook have each risen to channel all this social energy into a single, centralized location where it can be monetized.
(99-100) Our digital networks are biased toward social connections—toward contact. Any effort to redefine or hijack those connections for profit end up compromising the integrity of the network itself, and compromising the real promise of contact.

Anger over monetization of friendships by social networking sites.

(100) The anger people feel over a social networking site's ever changing policies really has less to do with any invasion of their privacy than the monetization of their friendships. The information gleaned from their activity is being used for other than social purposes—and this feels creepy. Friends are not bought and sold.

Concern that commingling commercial exploitation with sociality becomes normative behavior, possibly countered through awareness of how technologies enact this influence.

(102) If the social urge online comes to be understood as something necessarily commingled with commercial exploitation, then this will become the new normative human behavior as well.
(102) Who ends up exploited most, of course, is the person who has been convinced to behave this way. And that's where some awareness of how particular interfaces, tools, and programs influence our behavior is so valuable.
(103) Taking an action in a game instantly (and usually invisibly) turns one's entire network into a spam distribution list—selling her friends, and her friends-of-friends, to the game's real clients: market researchers and advertisers.

Social media contacts hint at potential future forms of collective organism.

(104) People are not things to be sold piecemeal, but living members of a network whose value can only be realized in a free-flowing and social context.
(105) The content is not the message, the contact is. The ping itself. It's the synaptic transmission of an organism trying to wake itself up.


VIII. FACT
Tell the Truth

Bazaar as prior social space for information exchange based largely on facts.

(106-107) Before what we think of as media even existed, the majority of our information exchange took place at the bazaar—the market and social space where people gathered to buy and sell goods, meet up with friends and, probably most importantly, learn what was happening in their world.
(107) The interactive medium of the day—conversation—was almost entirely based in facts.

Companies replaced interpersonal relationships with brands and myths.

(109) Sadly, along with the peer-to-peer economy went peer-to-peer communication. Companies tried to replace what had been relationships between people with relationships to brands. . . . To make this transition work, brands turned to the sorts of mythologies still in use today. The Quaker on a package of oats has nothing to do with the grain in the box; he is a story.
(110) People were working hard on assembly lines or in cubicles anyway, no longer experiencing themselves in their multiple social roles simultaneously. They were workers on the job trying to earn a paycheck, and consumers at home relaxing to the mythological drone of mass media.

Interactivity of digital media remediates the bazaar.

(110) The fundamental difference between mass media and digital media is interactivity.
(111) We're back in the bazaar. Only instead of individuals conversing one-on-one with our local friends and associates, each of us has a global reach greater than that of most broadcast television networks.

Interactions in digital media shift toward nonfiction, encouraging a truth-telling ethic.

(112) The bias of our interactions in digital media shifts back toward the nonfiction on which we all depend to make sense of our world, get the most done, and have the most fun. The more valuable, truthful, and real our messages, the more they will spread and the better we will do. We must learn to tell the truth.
(112-113) The information is still being presented and accepted as fact by newly minted digital citizens working against centuries of mythological control.
(113) As a person's value and connections in the digital realm become dependent on the strength of their facts and ideas, we return to a more memetic, fertile, and chaotic communications space.

Actions are more memetic than words; the need is to abandon brand mythology and return to communicating attributes.

(115) The beauty—and, for many, the horror—is that actions are even more memetic than words.
(115) In advertising, this means abandoning brand mythology and returning to attributes.

Social skill in sharing useful facts and disregarding nonsense.

(116) Likewise, people will thrive in a digital mediaspace as they learn to share the facts they've discovered and disregard the nonsense.
(116-117) Those who succeed as communicators in the new bazaar will be the ones who can quickly evaluate what they're hearing and learn to pass on only the stuff that matters. . . . But the real winners will once again be those who actually discover and innovate—the people who do and find things worthy of everyone else's attention.



IX. OPENNESS
Share, Don't Steal

Network designs reflect their creators' working ethos of sharing and openness, which also makes them vulnerable to attack.

(119-120) Perhaps because they witnessed how effective distributed processing was for computers, the builders of the networks we use today based both their designs as well as their own working ethos on the principles of sharing and openness. . . . This is what makes the Internet so powerful, and also part of what makes the Internet so vulnerable to attack: Pretty much everything has been designed to talk to strangers and offer assistance.

The bias toward openness, arising from the architecture of shared resources and gift-economy origins, challenges the distinction between sharing and stealing.

(120) This encouraged network developers to work in the same fashion. The net was built in a “gift economy” based more on sharing than profit.
(120-121) Digital technology's architecture of shared resources, as well as the gift economy through which the net was developed, have engendered a bias toward openness. It's as if our digital activity wants to be shared with others. As a culture and economy inexperienced in this sort of collaboration, however, we have great trouble distinguishing between sharing and stealing.
(123) What we're in the midst of now is a mediaspace where every creation is fodder for every other one.

Telepathy as potential evolutionary transformation of collective awareness and thinking latent in openness of networks and sharing of digital media.

(124) We are living in an age when thinking itself is no longer a personal activity but a collective one. . . . Many young people I've encountered see this rather terrifying loss of privacy and agency over our data as part of a learning curve. They see the human species evolving toward a more collective awareness, and the net's openness as a trial run for a biological reality where we all know each other's thoughts through telepathy.
(125) These same social norms do not yet apply to the net, where sharing, borrowing, stealing, and repurposing are all rather mashed up themselves.

DRM robbery of local resources and network bandwidth; link to Kittler criticism of protected mode and trusted computing.

(126) In a sense, these DRM strategies constitute a kind of robbery themselves. In order to work, these secretly planted programs must actually utilize some of the capacity of our computer's processors.

Advances such as Creative Commons licensing and free software licenses help clarify muddles over the use of digital media, but are often equated with a revolution of openness.

(126) Breaking copyright to steal and share music or movies becomes understood as the action of a legitimate openness movement, dedicated to access and equality for everyone. Very specific ideas about collaboration, such as open source development and Creative Commons licensing, are equated with a free-for-all revolution of openness.

The digital mediaspace extracts value from different places in the production cycle, which is incompatible with the print-based currency system.

(127) Value is still being extracted from the work—it's just being taken from a different place in the production cycle, and not passed down to the creators themselves.
(128) By confronting the biases of digital media head-on, however, we can come to terms with the seeming paradox of ownership in a digital mediaspace.
(129) The real problem is that while our digital mediaspace is biased toward a shared cost structure, our currency system is not. We are attempting to operate a twenty-first century digital economy on a thirteenth-century, printing-press-based operating system.

Direct commerce and peer-to-peer transactions are based on the abundance of production rather than the scarcity of lending, disrupting core capitalism; see the sketch after these quotes.

(130) Instead of buying from and selling to one another through highly centralized corporations, we now have the technology required to buy from and sell to one another directly.
(130) Instead of borrowing this money from a bank, users earn it into existence by making goods and performing services for other people in the same community. Peer-to-peer currencies are based in the abundance of production, rather than the scarcity of lending. This makes them biased, as is the net, toward transaction and exchange rather than hoarding for interest.
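
A minimal sketch of a mutual-credit currency as described in the quote, assuming the simplest possible ledger: money is earned into existence at the moment of transaction, every credit creates an equal debit, balances always sum to zero, and nothing is borrowed at interest.

    from collections import defaultdict

    balances = defaultdict(int)   # starts empty: no central issuer, no loans

    def transact(buyer, seller, amount):
        # The seller earns money into existence by providing a good or service;
        # the buyer's balance goes negative by the same amount.
        balances[buyer] -= amount
        balances[seller] += amount

    transact("alice", "bob", 30)   # bob fixes alice's fence
    transact("bob", "carol", 10)   # carol bakes for bob
    print(dict(balances), "sum:", sum(balances.values()))  # sum is always 0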

Golden rule as interim ethic; community agreement on abiding by its standards.

(132) Until that time, however, we are best governed not by what we can get away with, but how we want to be treated by others. The people on the other side of the screen spent time and energy on the things we read and watch. When we insist on consuming it for free, we are pushing them toward something much closer to the broadcast television model, where ads fund everything.
(132-133) We accept this model only because we don't know enough about how these systems work to make decisions about them intelligently. . . . While one of these statements may ultimately be legally enforceable, it is a system depending not on the courts but on the culture. To function, the community must agree to abide by its standards.

Participation depends on knowledge of programming and social codes.

(133) Participation is dependent on knowing both the programming code necessary to make valuable additions and the social codes necessary to do it in ways that respect the contributions of others.


X. PURPOSE
Program or Be Programmed

Public schools teach computer use, not programming; the user orientation defines success as behaving in conformance with programmed visions, making us more striated.

(135-136) We do not teach programming in most public schools. Instead of teaching programming, most schools with computer literacy curricula teach programs. Kids learn how to use popular spreadsheet, word processing, and browsing software so that they can operate effectively in the high-tech workplace.
(136) Their bigger problem is that their entire orientation to computing will be from the perspective of users. . . . Success means learning how to behave in the way the program needs her to.

Cultural bias privileging consumption and design, while actual coding viewed as boring, foreclosing on fostering awareness of creative ground in working code places.

(137) Instead, we see actual coding as some boring chore, a working-class skill like bricklaying, which may as well be outsourced to some poor nation while our kids play and even design video games. . . . We lose sight of the fact that the programming—the code itself—is the place from which the most significant innovations emerge.

Good argument about the consequences of ignorance of the biases of automotive transportation, which we approached as consumers rather than as civic planners.

Dependence on private automobiles like dependence on proprietary software.

(137-138) Throughout the twentieth century, we remained blissfully ignorant of the real biases of automotive transportation. We approached our cars as consumers, through ads, rather than as engineers or, better, civic planners. We gladly surrendered our public streetcars to private automobiles, unaware of the real expenses involved. . . . As a result, we couldn't see that our national landscape was being altered to manufacture dependence on the automobile. We also missed the possibility that these vehicles could make the earth's atmosphere unfit for human life, or that we would one day be fighting wars primarily to maintain the flow of oil required to keep them running.

Digital technology conveys our souls (Kittler) as boundaries of perceptual and conceptual apparatus (Clark, Hayles).

(138-139) Digital technology doesn't merely convey our bodies, but ourselves. . . . They are fast becoming the boundaries of our perceptual and conceptual apparatus; the edge between our nervous systems and everyone else's, our understanding of the world and the world itself.

A Socratic question lies at the core of program or be programmed; programming is the sweet spot, as dialectic was for the ancients when they realized the bias introduced by literacy.

(139) Our senses and our thoughts are already clouded by our own misperceptions, prejudices, and confusion. Our digital tools add yet another layer of bias on top of that. . . . Programming is the sweet spot, the high leverage point in a digital society. If we don't learn to program, we risk being programmed ourselves.

Short window of opportunity in late 1970s and early 1980s America as golden age of learning programming.

(139-140) Back in the 1970s, when computers were supposedly harder to use, there was no difference between operating a computer and programming one. Better public schools offered computer classes starting in the sixth or seventh grade, usually as an elective in the math department. Those of us lucky to grow up during that short window of opportunity learned to think of computers as “anything machines.”

Understanding programming helps transform mystery into science; the hacker bias promotes questioning default social organizations.

(140) I'm sure only one or two of us actually graduated to become professional programmers, but that wasn't the point. All of us came to understand what programming is, how programmers make decisions, and how those decisions influence the ways the software and its users function. For us, as the mystery of computers became the science of programming, many other mysteries seemed to vanish as well. For the person who understands code, the whole world reveals itself as a series of decisions made by planners and designers for how the rest of us should live. . . . Once the biases become apparent, anything becomes possible. The world and its many arbitrary systems can be hacked.

The opacity of interfaces putatively designed for user-friendliness buries the real workings of the machines; Rushkoff proposes the transformation was intentional, because the hacker ethic was bad for business, leaving the work to professionals.

(141) So the people investing in software and hardware development sought to discourage this hacker's bias by making interfaces more complex. The idea was to turn the highly transparent medium of computing into a more opaque one, like television. Interfaces got thicker and more supposedly “user friendly” while the real workings of the machine got buried further in the background.
(142) Better to buy a locked-down and locked-up device, and then just trust the company we bought it from to take care of us. Like it used to say on the back of the TV set: Hazard of electric shock. No user serviceable parts inside. Computing and programming were to be entrusted to professionals.

Striation resulting from advertising and lobbying to depend on out-of-the-box technology solutions.

(142-143) Of course none of this is really true. And the only way you'd really know this is if you understood programming. . . . Even the Pentagon is discouraged from developing its own security protocols through the Linux platform, by a Congress heavily lobbied to promote Windows.
(143) Like the military, we are to think of our technologies in terms of the applications they offer right out of the box instead of how we might change them or write our own.

Stages of historical human comportment to media development, from player to cheater to modder to programmer.

(144) The more open it is to modification, the more consistent software becomes with the social bias of digital media.
(144) These stages of development—from player to cheater to modder to programmer—mirror our own developing relationship to media through the ages. In preliterate civilizations, people attempted to live their lives and appease their gods with no real sense of the rules. . . . The invention of text gave them a set of rules to follow—or not. . . . Martin Luther posted his ninety-five theses, the first great “mod” of Catholicism, and later, nations rewrote their histories by launching their revolutions.

Virtual life instantiates living writing.

(144-145) Finally, the invention of digital technology gives us the ability to program: to create self-sustaining information systems, or virtual life. These are technologies that carry on long after we've created them, making future decisions without us. . . . Programming in a digital age means determining the codes and rules through which our many technologies will build the future—or at least how they will start out.

We remain a dimensional leap behind the current age of media technology.

(145) We have remained one dimensional leap behind the technology on offer.
(145-146) Finally, we have the tools to program. Yet we are content to seize only the capability of the last great media renaissance, that of writing. We feel proud to build a web page or finish our profile on a social networking site, as if this means we are now full-fledged participants in the cyber era.

Disinterest is enough for technology leaders to maintain their monopolies, perhaps because the small people spend so much of their psychic energy manipulating user interfaces, from social networks to automobiles.

(146-147) Yet again, we have surrendered the unfolding of a new technological age to a small elite who have seized the capability on offer. But while Renaissance kings maintained their monopoly over the printing presses by force, today's elite is depending on little more than our own disinterest.

Diminishing capabilities of Americans, increasing dependence on machines and other societies.

(147) The biases of the digital age will not just be those of the people who programmed it, but of the programs, machines, and life-forms they have unleashed. In the short term, we are looking at a society increasingly dependent on machines, yet decreasingly capable of making or even using them effectively. Other societies, such as China, where programming is more valued, seem destined to surpass us—unless, of course, the other forms of cultural repression in force there offset their progress as technologists.

The big picture is conscious, collective intervention in human evolution; compare to Lyotard's inhuman.

(148) In the long term, if we take up this challenge, we are looking at nothing less than the conscious, collective intervention of human beings in their own evolution. It's the opportunity of a civilization's lifetime. Shouldn't more of us want to participate actively in this project?

Tools will behave more humanely the more humans are involved in their design.

(149) the more humans become involved in their design, the more humanely inspired these tools will end up behaving.

Nod to noticing; now get on with learning working code (PHI: thinking about applying the first exam question to the beginning of the second chapter).

Everyone must learn enough to contend with the biases of digital technologies, even if we do not learn to program.

(149) Even if we don't all go out and learn to program—something any high school student can do with a decent paperback on the subject and a couple of weeks of effort—we must at least learn and contend with the essential biases of the technologies we will be living and working with from here on.



Rushkoff, Douglas. Program or Be Programmed: Ten Commands for a Digital Age. Berkeley, CA: Soft Skull Press, 2010.