http://chronicle.com/weekly/v50/i21/21b02601.htm
How Computers Change the Way We Think
By SHERRY TURKLE
The tools we use to think change the ways in which we
think. The invention of written language brought about a radical shift in how we
process, organize, store, and transmit representations of the world. Although
writing remains our primary information technology, today when we think about
the impact of technology on our habits of mind, we think primarily of the
computer.
My first encounters with how computers change the way we think came soon after I
joined the faculty at the Massachusetts Institute of Technology in the late
1970s, at the end of the era of the slide rule and the beginning of the era of
the personal computer. At a lunch for new faculty members, several senior
professors in engineering complained that the transition from slide rules to
calculators had affected their students' ability to deal with issues of scale.
When students used slide rules, they had to insert decimal points themselves.
The professors insisted that this discipline required students to maintain a
mental sense of scale, whereas those who relied on calculators made frequent
errors in orders of magnitude. The students with calculators had also lost the
ability to do "back of the envelope" calculations, and with that, an intuitive
feel for the material.
That same semester, I taught a course in the history of psychology. There, I
experienced the impact of computational objects on students' ideas about their
emotional lives. My class had read Freud's essay on slips of the tongue, with
its famous first example: The chairman of a parliamentary session opens a
meeting by declaring it closed. The students discussed how Freud interpreted
such errors as revealing a person's mixed emotions. A computer-science major
disagreed with Freud's approach. The mind, she argued, is a computer. And in a
computational dictionary -- like the one in the human mind -- "closed" and
"open" are designated by the same symbol, separated by a sign for opposition.
"Closed" equals "minus open." To substitute "closed" for "open" does not require
the notion of ambivalence or conflict.
"When the chairman made that substitution," she declared, "a bit was dropped; a
minus sign was lost. There was a power surge. No problem."
The young woman turned a Freudian slip into an information-processing error. An
explanation in terms of meaning had become an explanation in terms of mechanism.
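Her argument can be rendered as a toy sketch (my own illustration, not anything she wrote; the choice of Python and the representation of "open" as a signed number are assumptions made purely for the example):

    # A toy sketch, assuming a crude "computational dictionary" in which a word
    # and its opposite share one symbol and differ only by a sign.

    OPEN = +1           # the shared symbol, stored with a positive sign
    CLOSED = -OPEN      # "closed" equals "minus open"

    def speak(symbol):
        """Render the internal symbol as a spoken word."""
        return "open" if symbol > 0 else "closed"

    intended = CLOSED             # the chairman means "closed"
    uttered = abs(intended)       # the minus sign is lost -- "a bit was dropped"
    print(speak(intended), "->", speak(uttered))   # prints: closed -> open

Run as written, the sketch prints "closed -> open": the substitution falls out of a single lost sign, exactly the mechanism the student described, with no appeal to ambivalence.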
Such encounters turned me to the study of both the instrumental and the
subjective sides of the nascent computer culture. As an ethnographer and
psychologist, I began to study not only what the computer was doing for
us, but what it was doing to us, including how it was changing the way we
see ourselves, our sense of human identity.
In the 1980s, I surveyed the psychological effects of computational objects in
everyday life -- largely the unintended side effects of people's tendency to
project thoughts and feelings onto their machines. In the 20 years since,
computational objects have become more explicitly designed to have emotional and
cognitive effects. And those "effects by design" will become even stronger in
the decade to come. Machines are being designed to serve explicitly as
companions, pets, and tutors. And they are being introduced into school
settings for even the youngest children.
Today, starting in elementary school, students use e-mail, word processing,
computer simulations, virtual communities, and PowerPoint software. In the
process, they are absorbing more than the content of what appears on their
screens. They are learning new ways to think about what it means to know and
understand.
What follows is a short and certainly not comprehensive list of areas where I
see information technology encouraging changes in thinking. There can be no
simple way of cataloging whether any particular change is good or bad. That is
contested terrain. At every step we have to ask, as educators and citizens,
whether current technology is leading us in directions that serve our human
purposes. Such questions are not technical; they are social, moral, and
political. For me, addressing that subjective side of computation is one of the
more significant challenges for the next decade of information technology in
higher education. Technology does not determine change, but it encourages us to
take certain directions. If we make those directions clear, we can more easily
exert human choice.
Thinking about privacy. Today's college students are habituated to a
world of online blogging, instant messaging, and Web browsing that leaves
electronic traces. Yet they have had little experience with the right to
privacy. Unlike past generations of Americans, who grew up with the notion that
the privacy of their mail was sacrosanct, our children are accustomed to
electronic surveillance as part of their daily lives.
I have colleagues who feel that the increased incursions on privacy have put the
topic more in the news, and that this is a positive change. But middle-school
and high-school students tend to be willing to provide personal information
online with no safeguards, and college students seem uninterested in violations
of privacy and in increased governmental and commercial surveillance. Professors
find that students do not understand that in a democracy, privacy is a right,
not merely a privilege. In 10 years, ideas about the relationship of privacy and
government will require even more active pedagogy. (One might also hope that
increased education about the kinds of silent surveillance that technology makes
possible may inspire more active political engagement with the issue.)
Avatars or a self? Chat rooms, role-playing games, and other
technological venues offer us many different contexts for presenting ourselves
online. Those possibilities are particularly important for adolescents because
they offer what Erik Erikson described as a moratorium, a time out or safe space
for the personal experimentation that is so crucial for adolescent development.
Our dangerous world -- with crime, terrorism, drugs, and AIDS -- offers little
in the way of safe spaces. Online worlds can provide valuable spaces for
identity play.
But some people who gain fluency in expressing multiple aspects of self may find
it harder to develop authentic selves. Some children who write narratives for
their screen avatars may grow up with too little experience of how to share
their real feelings with other people. For those who are lonely yet afraid of
intimacy, information technology has made it possible to have the illusion of
companionship without the demands of friendship.
From powerful ideas to PowerPoint. In the 1970s and early 1980s, some
educators wanted to make programming part of the regular curriculum for K-12
education. They argued that because information technology carries ideas, it
might as well carry the most powerful ideas that computer science has to offer.
It is ironic that in most elementary schools today, the ideas being carried by
information technology are not ideas from computer science, such as procedural
thinking, but those embedded in productivity tools like PowerPoint presentation
software.
PowerPoint does more than provide a way of transmitting content. It carries its
own way of thinking, its own aesthetic -- which not surprisingly shows up in the
aesthetic of college freshmen. In that aesthetic, presentation becomes its own
powerful idea.
To be sure, the software cannot be blamed for lower intellectual standards. Its
misuse is as much a symptom of those standards as a cause. Indeed, the
culture in which our children are raised is increasingly a culture of
presentation, a corporate culture in which appearance is often more important
than reality. In contemporary political discourse, the bar has also been
lowered. Use of rhetorical devices at the expense of cogent argument regularly
goes without notice. But it is precisely because standards of intellectual rigor
outside the educational sphere have fallen that educators must attend to how we
use, and when we introduce, software that has been designed to simplify the
organization and processing of information.
In "The Cognitive Style of PowerPoint" (Graphics Press, 2003), Edward R. Tufte
suggests that PowerPoint equates bulleting with clear thinking. It does not
teach students to begin a discussion or construct a narrative. It encourages
presentation, not conversation. Of course, in the hands of a master teacher, a
PowerPoint presentation with few words and powerful images can serve as the
jumping-off point for a brilliant lecture. But in the hands of elementary-school
students, often introduced to PowerPoint in the third grade, and often
infatuated with its swooshing sounds, animated icons, and flashing text, a slide
show is more likely to close down debate than open it up.
Developed to serve the needs of the corporate boardroom, the software is
designed to convey absolute authority. Teachers used to tell students that clear
exposition depended on clear outlining, but presentation software has fetishized
the outline at the expense of the content.
Narrative, the exposition of content, takes time. PowerPoint, like so much in
the computer culture, speeds up the pace.
Word processing vs. thinking. The catalog for the Vermont Country Store
advertises a manual typewriter, which the advertising copy says "moves at a pace
that allows time to compose your thoughts." As many of us know, it is possible
to manipulate text on a computer screen and see how it looks faster than we can
think about what the words mean.
Word processing has its own complex psychology. From a pedagogical point of
view, it can make dedicated students into better writers because it allows them
to revise text, rearrange paragraphs, and experiment with the tone and shape of
an essay. Few professional writers would part with their computers; some claim
that they simply cannot think without their hands on the keyboard. Yet the
ability to quickly fill the page, to see it before you can think it, can make
bad writers even worse.
A seventh grader once told me that the typewriter she found in her mother's
attic is "cool because you have to type each letter by itself. You have to know
what you are doing in advance or it comes out a mess." The idea of thinking
ahead has become exotic.
Taking things at interface value. We expect software to be easy to use,
and we assume that we don't have to know how a computer works. In the early
1980s, most computer users who spoke of transparency meant that, as with any
other machine, you could "open the hood" and poke around. But only a few years
later, Macintosh users began to use the term when they talked about seeing their
documents and programs represented by attractive and easy-to-interpret icons.
They were referring to an ability to make things work without needing to go
below the screen surface. Paradoxically, it was the screen's opacity that
permitted that kind of transparency. Today, when people say that something is
transparent, they mean that they can see how to make it work, not that they know
how it works. In other words, transparency means epistemic opacity.
The people who built or bought the first generation of personal computers
understood them down to the bits and bytes. The next generation of operating
systems was more complex, but it still invited that old-time reductive
understanding. Contemporary information technology encourages different habits
of mind. Today's college students are already used to taking things at
(inter)face value; their successors in 2014 will be even less accustomed to probing
below the surface.
Simulation and its discontents. Some thinkers argue that the new opacity
is empowering, enabling anyone to use the most sophisticated technological tools
and to experiment with simulation in complex and creative ways. But it is also
true that our tools carry the message that they are beyond our understanding. It
is possible that in daily life, epistemic opacity can lead to passivity.
I first became aware of that possibility in the early 1990s, when the first
generation of complex simulation games was introduced and immediately became
popular for home as well as school use. SimLife teaches the principles of
evolution by getting children involved in the development of complex ecosystems;
in that sense it is an extraordinary learning tool. During one session in which
I played SimLife with Tim, a 13-year-old, the screen before us flashed a
message: "Your orgot is being eaten up." "What's an orgot?" I asked. Tim didn't
know. "I just ignore that," he said confidently. "You don't need to know that
kind of stuff to play."
For me, that story serves as a cautionary tale. Computer simulations enable
their users to think about complex phenomena as dynamic, evolving systems. But
they also accustom us to manipulating systems whose core assumptions we may not
understand and that may not be true.
We live in a culture of simulation. Our games, our economic and political
systems, and the ways architects design buildings, chemists envisage molecules,
and surgeons perform operations all use simulation technology. In 10 years the
degree to which simulations are embedded in every area of life will have
increased exponentially. We need to develop a new form of media literacy:
readership skills for the culture of simulation.
We come to written text with habits of readership based on centuries of
civilization. At the very least, we have learned to begin with the journalist's
traditional questions: who, what, when, where, why, and how. Who wrote these
words, what is their message, why were they written, and how are they situated
in time and place, politically and socially? A central project for higher
education during the next 10 years should be creating programs in
information-technology literacy, with the goal of teaching students to
interrogate simulations in much the same spirit, challenging their built-in
assumptions.
Despite the ever-increasing complexity of software, most computer environments
put users in worlds based on constrained choices. In other words, immersion in
programmed worlds puts us in reassuring environments where the rules are clear.
For example, when you play a video game, you often go through a series of
frightening situations that you escape by mastering the rules -- you experience
life as a reassuring dichotomy of scary and safe. Children grow up in a culture
of video games, action films, fantasy epics, and computer programs that all rely
on that familiar scenario of almost losing but then regaining total mastery:
There is danger. It is mastered. A still-more-powerful monster appears. It is
subdued. Scary. Safe.
Yet in the real world, we have never had a greater need to work our way out of
binary assumptions. In the decade ahead, we need to rebuild the culture around
information technology. In that new sociotechnical culture, assumptions about
the nature of mastery would be less absolute. The new culture would make it
easier, not more difficult, to consider life in shades of gray, to see moral
dilemmas in terms other than a battle between Good and Evil. For never has our
world been more complex, hybridized, and global. Never have we so needed to have
many contradictory thoughts and feelings at the same time. Our tools must help
us accomplish that, not fight against us.
Information technology is identity technology. Embedding it in a culture that
supports democracy, freedom of expression, tolerance, diversity, and complexity
of opinion is one of the next decade's greatest challenges. We cannot afford to
fail.
When I first began studying the computer culture, a small breed of highly
trained technologists thought of themselves as "computer people." That is no
longer the case. If we take the computer as a carrier of a way of knowing, a way
of seeing the world and our place in it, we are all computer people now.
Sherry Turkle is a professor of the social studies of science and technology
at the Massachusetts Institute of Technology.
Section: The Chronicle Review
Volume 50, Issue 21, Page B26
Copyright © 2004 by The Chronicle of Higher Education