By Jacob Mikanowski
Prospect Magazine, published April 24, 2014
I used to ask the internet everything. I started young. In the late 1980s, my family got its first modem. My father was a computer scientist, and he used it to access his computer at work. It was a silver box the size of a book; I liked its little red lights that told you when it was on and communicating with the world. Before long, I was logging onto message boards to ask questions about telescopes and fossils and plots of science fiction TV shows.
I kept at it for years, buying new hardware, switching browsers and search engines as needed. And then, around 2004, I stopped. Social media swallowed my friends whole, and I wanted no part of it. Friendster and Myspace and Facebook—the first great wave of social networking sites—all felt too invasive and too personal. I didn’t want to share, and I didn’t want to be seen.
So now, 10 years on, Facebook, iMessaging, and Twitter have passed me by. It’s become hard to keep up with people. I get all my news—weddings, moves, births, deaths—second-hand, from people who saw something on someone else’s feed. I never know what’s going on. In return, I have the vain satisfaction of feeling like the last real human being in a world of pods. But I am left wondering: what am I missing out on? And is everyone else missing out on something I still have?
Virginia Woolf famously said that on or about December 1910 human character changed. We don’t yet know if the same thing happened with the release of the iPhone 5—but, as the digital and “real” worlds become harder to distinguish from each other, it seems clear that something is shifting. The ways we interact with each other and with the world have altered. Yet the writing on this subject—whether it’s by social scientists, novelists or self-styled “internet intellectuals”—still doesn’t seem to have registered the full import of this transformation.
Speculation about the impact of technology on our present and future runs toward the extreme and the contradictory. Techno-utopians clash with techno-sceptics; extravagant claims of human perfectibility lock horns with dismal tales of cultural decline. This oscillation between boosterism and gloom is evident from a glance at some of the titles of influential books published over recent years: Everything Bad is Good for You: How Today’s Popular Culture is Actually Making Us Smarter; Big Data: A Revolution That Will Transform How We Live, Work, and Think; The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future.
Now we have three new books by academics—The App Generation, Status Update, and It’s Complicated (all published by Yale University Press)—each of which tries to bring some much-needed clarity to the discussion. These three books strive for a more balanced approach. Each comes at the question of what the internet is doing to our minds and social lives from an empirical, social-scientific point of view. But while they mostly avoid the extremes of jubilation and despair indulged in by so many other writers on the subject, all of them are limited by their own parameters as academic studies. They raise big questions while proposing partial, cautious answers. With the exception of It’s Complicated—whose title signals its more nuanced approach—the books fail to draw big conclusions.
The App Generation was written by two professors of education: Howard Gardner of Harvard, famous for his theory of multiple intelligences (verbal, visual, bodily-kinesthetic), and his student Katie Davis, now at the University of Washington. To discover what the internet is doing to human consciousness and society, they have studied its effect on children. They begin with an assertion: that today’s young people are so immersed in technology “they’ve come to think of the world as an ensemble of apps.”
What does this assertion mean? It’s never quite clear. The App Generation doesn’t dwell on which particular apps are shaping children in their image. As the authors define it, an app is any software program that runs on a smartphone. Apps also provide an overarching metaphor for many other aspects of life. Prayer is an app because it only works if “carried out according to… specified procedures.” Religion is a “super-app.” Of course, prayer and religion are not downloadable; they are “human choreographed,” as the authors put it. Yet young people, the argument goes, have come to understand these things as formally and functionally analogous to genuine apps such as RunKeeper (which tracks your fitness) and TonePad (which allows you to make music on your iPhone). Gardner and Davis’s line of thinking lends itself to dire pronouncements—today’s youth inhabit “app consciousness, an app worldview”—and to terrible puns: “Could just the right ensemble of apps lead to a wholly ‘hAPPy’ life?”
To Gardner and Davis, apps are essentially shortcuts to everything. By putting a solution to every problem in the palm of their hand, apps make children intellectually lazy. Worse, the children who rely on them to navigate the world turn out to be superficial narcissists and risk-averse people pleasers. They’re also increasingly lacking in one of the most important elements of childhood: imagination. Much of The App Generation attempts to chart the decline of American creativity. In order to measure it quantitatively, Gardner and Davis conducted an experiment in art criticism and literary theory. Analysing 20 years’ worth of artwork and short stories by teenagers, they classified each drawing as having a composition that was “conservative,” “neutral” or “unconventional.” Along the same lines, they designated each story as having a plot that was “everyday,” “not everyday” or the tantalising “everyday with a twist.” They found that since the early 1990s, the proportion of unconventional art increased by a modest 9 per cent, while the share of stories set in the fantastical world of the not everyday showed a steep drop from 64 to 14 per cent. Gardner and Davis conclude that, for the app-dependent, creativity itself is becoming a matter of “remixing,” rather than invention.
It’s hard to lend much credence to Gardner and Davis’s numbers. Their sample size is small, and their parameters for classifying art are hopelessly vague. But in the process of setting out their findings, they raise an important question: what are the young people they call “the app generation”—those who have never lived without the internet, without smartphones—actually like?
The behaviour of teens online can be baffling. But are they really more “risk-averse,” “dependent,” “superficial” and “narcissistic” than kids in the past? And are they in danger in some new, hard-to-track way? Danah Boyd, a researcher at New York University and Microsoft, isn’t so sure. In It’s Complicated, her detailed new anthropological inquiry into the internet habits of American teenagers, she does much to dispel many of the alarmist myths that surround young people and social media.
Boyd has spent over a decade interviewing teens about their use of social media, and in the process has developed a nuanced feel for how they live their online lives. Throughout It’s Complicated, she shows teens to be gifted at alternating between different languages and modes of self-presentation, assuming different personas for different audiences and switching platforms (say, between Facebook and Twitter and Ask.fm) based on their individual interests and levels of privacy. She also suggests that many of the fears associated with teens and the internet—from bullying to addiction—are overblown. She argues convincingly, for instance, that “Social media has not radically altered the dynamics of bullying, but it has made these dynamics more visible to more people.”
Social media may not lead to more bullying or addiction, but it does create lots of drama. Boyd and her sometime-collaborator Alice Marwick define drama as “performative, interpersonal conflict that takes place in front of an active, engaged audience, often on social media.” Essentially, “drama” is what keeps school from being boring, and what makes it such hell. It’s also the reason teenagers spend so much time online. The lure isn’t technology itself, or the utopian dream of a space in which anyone could become anything, which drew many young people to the internet in its early bulletin-board and newsgroup days; it’s socialising. Teens go online to “be with friends on their own terms, without adult supervision, and in public”—and Boyd argues that this is now much more difficult than it used to be. She portrays the US as a place in which teens are barred from public spaces such as parks and malls, and face constant monitoring from parents, teachers and the state. This is a paranoid country, in which parents try to channel all their children’s free time into structured activities and are so afraid of predators that they don’t allow their children outside alone. In this “culture of fear” social media affords teens one of their few avenues for autonomous expression.
Parents never understand; but Boyd makes the case that adult cluelessness about the multiple uses teens find for social media—everything from sharing jokes to showing off for university recruiters—can be especially harmful now. She tells the story of a teenager from south central Los Angeles who writes an inspiring college entrance essay about his desire to escape his gang-ridden neighbourhood. But when admissions officers at the Ivy League university to which he’s applying Google him, they are shocked to discover that his Myspace profile is filled with gang symbolism and references to gang activities. They do not consider that this might be a survival strategy rather than a case of outright deception. In a similar fashion, admissions officers at other universities who use Facebook to recruit students inadvertently neglect the millions of black and Latino youth who are (or at least, were a few years ago) more likely to use Myspace.
Who are the people designing these all-pervasive technologies? In Status Update, Alice Marwick focuses on the world of tech-bloggers and internet celebrities who came to San Francisco to seek their fortunes in the digital gold rush of the mid-2000s.
It is an incisive portrait of a local culture that is rapidly becoming global: one in which attention equals success, fortune favours the self-aggrandising and luck is always mistaken for destiny. Marwick’s subjects include marketing gurus, “startup coaches” and “happiness engineers” whose work is closely followed by entrepreneurs and aspiring tech moguls in the bubble of Silicon Valley. These are people whose businesses are essentially themselves, which requires them to engage in relentless self-promotion through all the new tools social media affords.
As a result, they’re obsessed with status, which they measure in attention. The “micro-celebrities” Marwick follows are cliquish, backbiting and occasionally nasty. (Certainly their manners need work; one tech blogger bellows at Marwick: “I’m not speaking to you anymore—you stopped following me on Twitter!”) They stream their lives online and work tirelessly on their personal brands. For guidance, they turn to the advice of dubious self-help gurus like Gary Vaynerchuk, who instructs his charges to monetise every moment of their lives by blogging their passion, and hyperkinetic weirdos like Tim Ferriss, who has made a fortune by advising readers to “hack” their diets and outsource all their admin to India in pursuit of the elusive four-hour work week. (Another of his books is the modestly titled The 4-Hour Body: An uncommon guide to rapid fat-loss, incredible sex and becoming superhuman.)
To Marwick, now a professor of communication and media studies at Fordham University, this preoccupation with accruing more and more Twitter followers, Facebook likes and conference invitations among the “numerati” belies the democratic promise of the internet as an egalitarian space. But Marwick never fully explains why the activities of this really quite narrow social sphere (I live only a short drive across the San Francisco Bay from them, and barely have an idea who most of her protagonists are) should be a betrayal of a utopian ideal from 20 years ago. Moreover, her attempts to link their activities to larger forces, from the spread of free-market ideology to the post-Enlightenment rise of oppressive “technologies of the self,” feel forced. Much of the book comes off as a series of anecdotes in search of a thesis.
Like an anthropologist, Marwick immersed herself in a milieu and made it her own. But in the process she may have lost some perspective. After reading Status Update, I had the same feeling as after completing It’s Complicated and The App Generation—that I was no closer to understanding the substance of the change wrought by social media on our lives. We need more writers thinking deeply about the way the internet reorders our experience of everyday life. Not just the ways it makes tasks easier or changes the way we socialise and communicate with one another, but the way it shapes our wants, our fears, our way of thinking and talking.
One thing none of the three authors addresses in depth is the way the internet has altered modes of communication. Some linguists have already examined the way texting is changing the syntax and orthography of writing (David Crystal’s Txtng: the Gr8 Db8 comes to mind), but little has been done yet to analyse the newer online shift from text to image. Social media and blogging platforms like Tumblr are increasingly making it possible to conduct whole conversations entirely in pictures. And, to judge from current usage, it seems that photographs from Breaking Bad and animated gifs of shuffling octopuses are proving popular alternatives to text as a means to convey human emotions. Rather than make fun of this trend, it would be instructive to see someone approach this new image-language with the gravity of the philosopher JL Austin, who, when he analysed ordinary language, found it to be, in the critic Dave Hickey’s words, a more “subtle, delicate and resourceful instrument than the scholastic, philosophical language of his day.” At the very least it would be interesting. (And who knows? Maybe the next Tractatus will be written in screencaps from Doctor Who.)
And what about our changing perceptions of time and space? In The App Generation, Katie Davis remarks that her younger sister has never had the experience of being lost, and probably never will, unless she loses her phone. What does never getting lost do to someone’s experience of the world? With GPS everywhere, is a forest still a forest or is it just a collection of trees? And how many other states of being are vanishing? Boyd (refreshingly) insists that “the kids are alright”—but her book also suggests that they are never really alone. Are boredom, solitude and aimlessness on their way out, too?
We need someone to do for the internet what the French phenomenologist Gaston Bachelard did for architecture. In The Poetics of Space, Bachelard explained the way our different metaphors of enclosure—houses, corners, shells, and nests—mimic and shape the way we imagine ourselves to be in the world. For Bachelard, images came before thoughts. They were the way the mind represented the world to itself. No social scientist is going to go down this road, but someone should—a philosopher, or a poet, or a novelist—especially because technology is threatening to make extinct many once-commonplace types of experience.
These rapidly vanishing experiences matter—to philosophy and literature, above all. For Martin Heidegger, the feeling of profound boredom—which he felt while waiting for a train at a provincial train station, for instance—brought one closest to the kind of active attention that separates human beings from animals. Boredom drove Michel de Montaigne to write his self-reflections. And once he started, he needed absolute privacy and silence to continue. He found them in the corner of his library, the sort of corner Bachelard spoke of when he wrote: “When we recall the hours we have spent in our corners, we remember above all silence, the silence of our thoughts.”
The kind of experience of selfhood that Bachelard and Montaigne describe is one of interiority—discovering one’s self through introspection. We almost take that equation as a given, but it wasn’t always. The search for the origins of the modern self has been one of the great snipe hunts in the history of the humanities. For Jacob Burckhardt it began in the Italian Renaissance; for Norbert Elias in the court of Louis XIV; for Harold Bloom it all started with Shakespeare. But the point is the same—for a long time the self was one way, and then it was another. As Lionel Trilling put it in Sincerity and Authenticity, “in the late 16th and early 17th centuries, something like a mutation in human nature took place.” Medieval people defined themselves as members of groups, and then, suddenly, they became Renaissance “individuals.” The change registers in poetry, in painting, in philosophy. You can hear it in Hamlet’s soliloquies and see it in Italian portraiture—starting around 1500, when these people look at you, they hold something back. They live inside their heads.
But do we anymore? Maybe it’s time to start asking questions about exteriority; to switch from the corner to the phone, from the introspective essay to the online profile. Starting some 500 years ago, the self was understood as an enclosure. It was something that required silence to access and space to experience. I think that used to be true. It probably still is. But it might not be for very much longer.