Trick Mirror
TRICK MIRROR
REFLECTIONS ON SELF-DELUSION
Jia Tolentino
Copyright
4th Estate
An imprint of HarperCollinsPublishers
1 London Bridge Street
London SE1 9GF
www.4thEstate.co.uk
This eBook first published in Great Britain by 4th Estate in 2019
First published in the United States by Random House in 2019
Copyright © Jia Tolentino 2019
Jia Tolentino asserts the moral right to be identified as the author of this work
A catalogue record for this book is available from the British Library
All rights reserved under International and Pan-American Copyright Conventions. By payment of the required fees, you have been granted the non-exclusive, non-transferable right to access and read the text of this e-book on-screen. No part of this text may be reproduced, transmitted, downloaded, decompiled, reverse engineered, or stored in or introduced into any information storage and retrieval system, in any form or by any means, whether electronic or mechanical, now known or hereinafter invented, without the express written permission of HarperCollins
Source ISBN: 9780008294922
Ebook Edition © August 2019 ISBN: 9780008294946
Version: 2019-06-21
Dedication
For my parents
Contents
Cover
Title Page
Copyright
Dedication
Introduction
The I in the Internet
Reality TV Me
Always Be Optimizing
Pure Heroines
Ecstasy
The Story of a Generation in Seven Scams
We Come from Old Virginia
The Cult of the Difficult Woman
I Thee Dread
Acknowledgments
Background Reading
About the Author
About the Publisher
Introduction
I wrote this book between the spring of 2017 and the fall of 2018—a period during which American identity, culture, technology, politics, and discourse seemed to coalesce into an unbearable supernova of perpetually escalating conflict, a stretch of time when daily experience seemed both like a stopped elevator and an endless state-fair ride, when many of us regularly found ourselves thinking that everything had gotten as bad as we could possibly imagine, after which, of course, things always got worse.
Throughout this period, I found that I could hardly trust anything that I was thinking. A doubt that always hovers in the back of my mind intensified: that whatever conclusions I might reach about myself, my life, and my environment are just as likely to be diametrically wrong as they are to be right. This suspicion is hard for me to articulate precisely, in part because I usually extinguish it by writing. When I feel confused about something, I write about it until I turn into the person who shows up on paper: a person who is plausibly trustworthy, intuitive, and clear.
It’s exactly this habit—or compulsion—that makes me suspect that I am fooling myself. If I were, in fact, the calm person who shows up on paper, why would I always need to hammer out a narrative that gets me there? I’ve been telling myself that I wrote this book because I was confused after the election, because confusion is at odds with my temperament, because writing is my only strategy for making this conflict go away. I’m convinced by this story, even as I can see its photonegative: I wrote this book because I am always confused, because I can never be sure of anything, and because I am drawn to any mechanism that directs me away from that truth. Writing is either a way to shed my self-delusions or a way to develop them. A well-practiced, conclusive narrative is usually a dubious one: that a person is “not into drama,” or that America needs to be made great again, or that America is already great.
These essays are about the spheres of public imagination that have shaped my understanding of myself, of this country, and of this era. One is about the internet. Another is about “optimization,” and the rise of athleisure as late-capitalist fetishwear, and the endlessly proliferating applications of the idea that women’s bodies should increase their market performance over time. There’s an essay about drugs and religion and the bridge that ecstasy forms between them; another about scamming as the definitive millennial ethos; another about the literary heroine’s journey from brave girl to depressed teenager to bitter adult woman who’s possibly dead. One essay is about my stint as a teenage reality TV contestant. One is about sex and race and power at the University of Virginia, my alma mater, where a series of convincing stories have exacted enormous hidden costs. The final two are about the feminist obsession with “difficult” women and about the slow-burning insanity that I acquired in my twenties while attending what felt like several thousand weddings per year. These are the prisms through which I have come to know myself. In this book, I tried to undo their acts of refraction. I wanted to see the way I would see in a mirror. It’s possible I painted an elaborate mural instead.
But that’s fine. The last few years have taught me to suspend my desire for a conclusion, to assume that nothing is static and that renegotiation will be perpetual, to hope primarily that little truths will keep emerging in time. While I was writing this, a stranger tweeted an excerpt of a Jezebel piece I wrote in 2015, highlighting a sentence about what women seemed to want from feminist websites—a “trick mirror that carries the illusion of flawlessness as well as the self-flagellating option of constantly finding fault.” I had not remembered using that phrase when I came up with a book title, and I had not understood, when I was writing that Jezebel piece, that that line was also an explanation of something more personal. I began to realize that all my life I’ve been leaving myself breadcrumbs. It didn’t matter that I didn’t always know what I was walking toward. It was worthwhile, I told myself, just trying to see clearly, even if it took me years to understand what I was trying to see.
The I in the Internet
In the beginning the internet seemed good. “I was in love with the internet the first time I used it at my dad’s office and thought it was the ULTIMATE COOL,” I wrote, when I was ten, on an Angelfire subpage titled “The Story of How Jia Got Her Web Addiction.” In a text box superimposed on a hideous violet background, I continued:
But that was in third grade and all I was doing was going to Beanie Baby sites. Having an old, icky bicky computer at home, we didn’t have the Internet. Even AOL seemed like a far-off dream. Then we got a new top-o’-the-line computer in spring break ’99, and of course it came with all that demo stuff. So I finally had AOL and I was completely amazed at the marvel of having a profile and chatting and IMS!!
Then, I wrote, I discovered personal webpages. (“I was astonished!”) I learned HTML and “little Javascript trickies.” I built my own site on the beginner-hosting site Expage, choosing pastel colors and then switching to a “starry night theme.” Then I ran out of space, so I “decided to move to Angelfire. Wow.” I learned how to make my own graphics. “This was all in the course of four months,” I wrote, marveling at how quickly my ten-year-old internet citizenry was evolving. I had recently revisited the sites that had once inspired me, and realized “how much of an idiot I was to be wowed by that.”
I have no memory of inadvertently starting this essay two decades ago, or of making this Angelfire subpage, which I found while hunting for early traces of myself on the internet. It’s now eroded to its skeleton: its landing page, titled “THE VERY BEST,” features a sepia-toned photo of Andie from Dawson’s Creek and a dead link to a new site called “THE FROSTED FIELD,” which is “BETTER!” There’s a page dedicated to a blinking mouse GIF named Susie, and a “Cool Lyrics Page” with a scrolling banner and the lyrics to Smash Mouth’s “All Star,” Shania Twain’s “Man! I Feel Like a Woman!” and the TLC diss track “No Pigeons,” by Sporty Thievz. On an FAQ page—there was an FAQ page—I write that I had to close down my customizable cartoon-doll section, as “the response has been enormous.”
It appears that I built and used this Angelfire site over just a few months in 1999, immediately after my parents got a computer. My insane FAQ page specifies that the site was started in June, and a page titled “Journal”—which proclaims, “I am going to be completely honest about my life, although I won’t go too deeply into personal thoughts, though”—features entries only from October. One entry begins: “It’s so HOT outside and I can’t count the times acorns have fallen on my head, maybe from exhaustion.” Later on, I write, rather prophetically: “I’m going insane! I literally am addicted to the web!”
In 1999, it felt different to spend all day on the internet. This was true for everyone, not just for ten-year-olds: this was the You’ve Got Mail era, when it seemed that the very worst thing that could happen online was that you might fall in love with your business rival. Throughout the eighties and nineties, people had been gathering on the internet in open forums, drawn, like butterflies, to the puddles and blossoms of other people’s curiosity and expertise. Self-regulated newsgroups like Usenet cultivated lively and relatively civil discussion about space exploration, meteorology, recipes, rare albums. Users gave advice, answered questions, made friendships, and wondered what this new internet would become.
Because there were so few search engines and no centralized social platforms, discovery on the early internet took place mainly in private, and pleasure existed as its own solitary reward. A 1995 book called You Can Surf the Net! listed sites where you could read movie reviews or learn about martial arts. It urged readers to follow basic etiquette (don’t use all caps; don’t waste other people’s expensive bandwidth with overly long posts) and encouraged them to feel comfortable in this new world (“Don’t worry,” the author advised. “You have to really mess up to get flamed.”). Around this time, GeoCities began offering personal website hosting for dads who wanted to put up their own golfing sites or kids who built glittery, blinking shrines to Tolkien or Ricky Martin or unicorns, most capped off with a primitive guest book and a green-and-black visitor counter. GeoCities, like the internet itself, was clumsy, ugly, only half functional, and organized into neighborhoods: /area51/ was for sci-fi, /westhollywood/ for LGBTQ life, /enchantedforest/ for children, /petsburgh/ for pets. If you left GeoCities, you could walk around other streets in this ever-expanding village of curiosities. You could stroll through Expage or Angelfire, as I did, and pause on the thoroughfare where the tiny cartoon hamsters danced. There was an emergent aesthetic—blinking text, crude animation. If you found something you liked, if you wanted to spend more time in any of these neighborhoods, you could build your own house from HTML frames and start decorating.
This period of the internet has been labeled Web 1.0—a name that works backward from the term Web 2.0, which was coined by the writer and user-experience designer Darcy DiNucci in an article called “Fragmented Future,” published in 1999. “The Web we know now,” she wrote, “which loads into a browser window in essentially static screenfuls, is only an embryo of the Web to come. The first glimmerings of Web 2.0 are beginning to appear … The Web will be understood not as screenfuls of texts and graphics but as a transport mechanism, the ether through which interactivity happens.” On Web 2.0, the structures would be dynamic, she predicted: instead of houses, websites would be portals, through which an ever-changing stream of activity—status updates, photos—could be displayed. What you did on the internet would become intertwined with what everyone else did, and the things other people liked would become the things that you would see. Web 2.0 platforms like Blogger and Myspace made it possible for people who had merely been taking in the sights to start generating their own personalized and constantly changing scenery. As more people began to register their existence digitally, a pastime turned into an imperative: you had to register yourself digitally to exist.
In a New Yorker piece from November 2000, Rebecca Mead profiled Meg Hourihan, an early blogger who went by Megnut. In just the prior eighteen months, Mead observed, the number of “weblogs” had gone from fifty to several thousand, and blogs like Megnut were drawing thousands of visitors per day. This new internet was social (“a blog consists primarily of links to other Web sites and commentary about those links”) in a way that centered on individual identity (Megnut’s readers knew that she wished there were better fish tacos in San Francisco, and that she was a feminist, and that she was close with her mom). The blogosphere was also full of mutual transactions, which tended to echo and escalate. The “main audience for blogs is other bloggers,” Mead wrote. Etiquette required that, “if someone blogs your blog, you blog his blog back.”
Through the emergence of blogging, personal lives were becoming public domain, and social incentives—to be liked, to be seen—were becoming economic ones. The mechanisms of internet exposure began to seem like a viable foundation for a career. Hourihan cofounded Blogger with Evan Williams, who later cofounded Twitter. JenniCam, founded in 1996 when the college student Jennifer Ringley started broadcasting webcam photos from her dorm room, attracted at one point up to four million daily visitors, some of whom paid a subscription fee for quicker-loading images. The internet, in promising a potentially unlimited audience, began to seem like the natural home of self-expression. In one blog post, Megnut’s boyfriend, the blogger Jason Kottke, asked himself why he didn’t just write his thoughts down in private. “Somehow, that seems strange to me though,” he wrote. “The Web is the place for you to express your thoughts and feelings and such. To put those things elsewhere seems absurd.”
Every day, more people agreed with him. The call of self-expression turned the village of the internet into a city, which expanded at time-lapse speed, social connections bristling like neurons in every direction. At ten, I was clicking around a web ring to check out other Angelfire sites full of animal GIFs and Smash Mouth trivia. At twelve, I was writing five hundred words a day on a public LiveJournal. At fifteen, I was uploading photos of myself in a miniskirt on Myspace. By twenty-five, my job was to write things that would attract, ideally, a hundred thousand strangers per post. Now I’m thirty, and most of my life is inextricable from the internet, and its mazes of incessant forced connection—this feverish, electric, unlivable hell.
As with the transition between Web 1.0 and Web 2.0, the curdling of the social internet happened slowly and then all at once. The tipping point, I’d guess, was around 2012. People were losing excitement about the internet, starting to articulate a set of new truisms. Facebook had become tedious, trivial, exhausting. Instagram seemed better, but would soon reveal its underlying function as a three-ring circus of happiness and popularity and success. Twitter, for all its discursive promise, was where everyone tweeted complaints at airlines and bitched about articles that had been commissioned to make people bitch. The dream of a better, truer self on the internet was slipping away. Where we had once been free to be ourselves online, we were now chained to ourselves online, and this made us self-conscious. Platforms that promised connection began inducing mass alienation. The freedom promised by the internet started to seem like something whose greatest potential lay in the realm of misuse.
Even as we became increasingly sad and ugly on the internet, the mirage of the better online self continued to glimmer. As a medium, the internet is defined by a built-in performance incentive. In real life, you can walk around living life and be visible to other people. But you can’t just walk around and be visible on the internet—for anyone to see you, you have to act. You have to communicate in order to maintain an internet presence. And, because the internet’s central platforms are built around personal profiles, it can seem—first at a mechanical level, and later on as an encoded instinct—like the main purpose of this communication is to make yourself look good. Online reward mechanisms beg to substitute for offline ones, and then overtake them. This is why everyone tries to look so hot and well-traveled on Instagram; this is why everyone seems so smug and triumphant on Facebook; this is why, on Twitter, making a righteous political statement has come to seem, for many people, like a political good in itself.
This practice is often called “virtue signaling,” a term most often used by conservatives criticizing the left. But virtue signaling is a bipartisan, even apolitical action. Twitter is overrun with dramatic pledges of allegiance to the Second Amendment that function as intra-right virtue signaling, and it can be something like virtue signaling when people post the suicide hotline after a celebrity death. Few of us are totally immune to the practice, as it intersects with real desire for political integrity. Posting photos from a protest against border family separation, as I did while writing this, is a microscopically meaningful action, an expression of genuine principle, and also, inescapably, some sort of attempt to signal that I am good.
Taken to its extreme, virtue signaling has driven people on the left to some truly unhinged behavior. A legendary case occurred in June 2016, after a two-year-old was killed at a Disney resort—dragged off by an alligator while playing in a no-swimming-allowed lagoon. A woman, who had accumulated ten thousand Twitter followers with her posts about social justice, saw an opportunity and tweeted, magnificently, “I’m so finished with white men’s entitlement lately that I’m really not sad about a 2yo being eaten by a gator because his daddy ignored signs.” (She was then pilloried by people who chose to demonstrate their own moral superiority through mockery—as I am doing here, too.) A similar tweet made the rounds in early 2018 after a sweet story went viral: a large white seabird named Nigel had died next to the concrete decoy bird to whom he had devoted himself for years. An outraged writer tweeted, “Even concrete birds do not owe you affection, Nigel,” and wrote a long Facebook post arguing that Nigel’s courtship of the fake bird exemplified … rape culture. “I’m available to write the feminist perspective on Nigel the gannet’s non-tragic death should anyone wish to pay me,” she added, underneath the original tweet, which received more than a thousand likes. These deranged takes, and their unnerving proximity to online monetization, are case studies in the way that our world—digitally mediated, utterly consumed by capitalism—makes communication about morality very easy but makes actual moral living very hard. You don’t end up using a news story about a dead toddler as a peg for white entitlement without a society in which the discourse of righteousness occupies far more public attention than the conditions that necessitate righteousness in the first place.
On the right, the online performance of political identity has been even wilder. In 2017, the social-media-savvy youth conservative group Turning Point USA staged a protest at Kent State University featuring a student who put on a diaper to demonstrate that “safe spaces were for babies.” (It went viral, as intended, but not in the way TPUSA wanted—the protest was uniformly roasted, with one Twitter user slapping the logo of the porn site Brazzers on a photo of the diaper boy, and the Kent State TPUSA campus coordinator resigned.) It has also been infinitely more consequential, beginning in 2014, with a campaign that became a template for right-wing internet-political action, when a large group of young misogynists came together in the event now known as Gamergate.
The issue at hand was, ostensibly, a female game designer accused of sleeping with a journalist for favorable coverage. She, along with a set of feminist game critics and writers, received an onslaught of rape threats, death threats, and other forms of harassment, all concealed under the banner of free speech and “ethics in games journalism.” The Gamergaters—estimated by Deadspin to number around ten thousand people—would mostly deny this harassment, either parroting in bad faith or fooling themselves into believing the argument that Gamergate was actually about noble ideals. Gawker Media, Deadspin’s parent company, itself became a target, in part because of its own aggressive disdain toward the Gamergaters: the company lost seven figures in revenue after its advertisers were brought into the maelstrom.
In 2016, a similar fiasco made national news in Pizzagate, after a few rabid internet denizens decided they’d found coded messages about child sex slavery in the advertising of a pizza shop associated with Hillary Clinton’s campaign. This theory was disseminated all over the far-right internet, leading to an extended attack on DC’s Comet Ping Pong pizzeria and everyone associated with the restaurant—all in the name of combating pedophilia—that culminated in a man walking into Comet Ping Pong and firing a gun. (Later on, the same faction would jump to the defense of Roy Moore, the Republican nominee for the Senate who was accused of sexually assaulting teenagers.) The over-woke left could only dream of this ability to weaponize a sense of righteousness. Even the militant antifascist movement, known as antifa, is routinely disowned by liberal centrists, despite the fact that the antifa movement is rooted in a long European tradition of Nazi resistance rather than a nascent constellation of radically paranoid message boards and YouTube channels. The worldview of the Gamergaters and Pizzagaters was actualized and to a large extent vindicated in the 2016 election—an event that strongly suggested that the worst things about the internet were now determining, rather than reflecting, the worst things about offline life.
Mass media always determines the shape of politics and culture. The Bush era is inextricable from the failures of cable news; the executive overreaches of the Obama years were obscured by the internet’s magnification of personality and performance; Trump’s rise to power is inseparable from the existence of social networks that must continually aggravate their users in order to continue making money. But lately I’ve been wondering how everything got so intimately terrible, and why, exactly, we keep playing along. How did a huge number of people begin spending the bulk of our disappearing free time in an openly torturous environment? How did the internet get so bad, so confining, so inescapably personal, so politically determinative—and why are all those questions asking the same thing?
I’ll admit that I’m not sure that this inquiry is even productive. The internet reminds us on a daily basis that it is not at all rewarding to become aware of problems that you have no reasonable hope of solving. And, more important, the internet already is what it is. It has already become the central organ of contemporary life. It has already rewired the brains of its users, returning us to a state of primitive hyperawareness and distraction while overloading us with much more sensory input than was ever possible in primitive times. It has already built an ecosystem that runs on exploiting attention and monetizing the self. Even if you avoid the internet completely—my partner does: he thought #tbt meant “truth be told” for ages—you still live in the world that this internet has created, a world in which selfhood has become capitalism’s last natural resource, a world whose terms are set by centralized platforms that have deliberately established themselves as near-impossible to regulate or control.