Last fall, around the time Britney Spears’s memoir The Woman in Me was published, I went to the Brooklyn stop of Liz Phair’s 30th anniversary tour for her debut album Exile in Guyville. Exile is one of the epochal albums of the 1990s, a Gen X classic; it came out when I was a freshman in college, and every track on it reminds me of a certain lonely dorm room, a wintry campus, an unfinished term paper on W.H. Auden. The audience, made up of people who mostly looked just like me, was there to relive those memories of a very different time. We sang along to every note. Phair—who in the 90s was a famously nervous and unpredictable performer—didn’t engage in much stage banter, but she looked relaxed and happy. For the very last song of her encore, she played the one she’s most famous for—not from Exile but from her “reboot” album, Liz Phair, released ten years later: “Why Can’t I?”
Phair, some people will remember, briefly reinvented herself in the 2000s (in her mid-thirties) as a shiny corporate pop star with an album partly written by Avril Lavigne’s songwriting team, The Matrix. “Why Can’t I” was featured in the 2004 comedy 13 Going on 30. In the press, she was raked over the coals: in fact, it was my friend Meghan O’Rourke—we met in college that same fateful year, 1993-94—who wrote the most damning piece in the New York Times:
Where P.J. Harvey’s wailing or Courtney Love’s anger were shamanistic and almost feral, Ms. Phair was reassuringly human in her appetites, her arrogance, her fear, her inability to quite hit that high note. Her sexual frankness went hand in hand with a recorded-in-the-garage immediacy… Her signature style of drawing a word out across several notes in a kind of dull trill made a mockery of all that was feminine about singing… [Her new album] lacks the distinctive flair and sass of Ms. Phair’s earlier work, and has little of its savvy insight. The songs are catchy, replete with pop hooks, but they’re relentlessly peppy, and often Ms. Phair sounds as over-carbonated as a 13-year-old full of Diet Coke and Pop Rocks. The slick production diminishes her boldness the same way those child-size T-shirts emblazoned with the word “Sexy” always seem to make a mockery of their wearers.
As a youngish Gen Xer, my teenage and college years fall on one side of a major cultural divide: for the sake of argument, you could date it to April 1999, when Britney Spears appeared in her childhood bedroom on the cover of Rolling Stone. Before that was the “alternative” 1990s: the years when punk broke, when grunge and noise-rock and Riot Grrrl and the glorious MCs of hip hop’s golden age were mainstream, when slackers and dropouts and utopians ran the scene.
In 1999 all that was replaced by a new crop of shiny teen-dream confections: Britney, Christina, N’Sync, Backstreet Boys, the Spice Girls. There was a particular feeling of generational horror for Gen Xers who by 2000 were being decisively displaced by millennials as the cultural vanguard, and more importantly, the key marketing audience. In the space of a few years, the sex-positivity of third wave feminism—adult women insisting on their own autonomy, owning their bodies—was swept away by a regressive cultural obsession with virginity, purity, and teenage girls performing for the leers of adult men. (And then, if purity culture wasn’t bad enough, came the confounding, value-free gibberish known as indie sleaze.)
That late-90s cultural shift was and is important, and never more so than now, when the survivors of the purity era (Spears, the Duggar children, Lindsay Lohan, Miley Cyrus, the former child actors exploited by Nickelodeon) are taking stock of their lives. But watching Liz Phair perform songs from both sides of that shift, I was struck by something else: the 1990s, the rageful and quirky and ironic days of my youth, are another successful franchise. If I wanted to, I could fill my calendar with nostalgia shows: Dinosaur Jr., Pavement, the Pixies, Bikini Kill, Into Another, Orange 9mm, Quicksand, Helmet, Magnetic Fields, Jawbox, and Sunny Day Real Estate are all on tour this year.
Thanks to the all-but-monopoly control Live Nation has over the concert business, most of my ticket money—$60-$200 at a time—flows to one corporation. And Live Nation doesn’t care about cultural shifts, ideological allegiances, or what people in the 1990s might have called “semiotic fields”: concert promoters are happy to sell culture as a buffet, especially at the big festivals like Coachella, where everyone gets something they want.
Of course, Gen Xers always knew this would happen: the gradual folding of everything that could possibly be called “culture” into one image-spectacle-and-sensorium corporate machine that thrives on endless niche differentiation as a way of metastasizing its market share. It’s there in the song Kurt Cobain wrote around the dumbest chord progression he could think of. You could say we saw late capitalism coming in real time, planned for it, in many cases enabled and amplified it.
In 1993, the Year of Liz Phair, writing for The Baffler, Thomas Frank explained the process in an epochal essay, “Why Johnny Can’t Dissent.” Noting that Nike and MTV were able to co-opt major icons of the counterculture (William S. Burroughs and Gil Scott-Heron, respectively) for their ad campaigns without anyone caring, Frank warned that corporations were assimilating the language and attitudes associated with post-WWII youth culture, from the Beats to punk rock, for themselves: Disrupt everything! Question authority! Overthrow the hierarchies! “Our businessmen sound like rebels,” Frank wrote, “and our rebels sound more and more like ideologists of business.”
In 2024, this is very old news. (It was also old news in 1993: Frank was rephrasing, or reinventing, arguments by Theodor Adorno, Guy Debord, Pierre Bourdieu, and Walter Benjamin.) But it’s worth noting just how advanced we’ve become: that is, how much of contemporary culture consists of various kinds of wall-to-wall fandom within a single brand or IP universe.
To choose one extreme example (but different in degree, not in kind), the Disney Adult. I met my first Disney Adult about twenty years ago; he was a mild-mannered, unexceptional student in one of my creative writing classes, and when I asked why so much of his material dealt with Walt Disney, he replied that he considered WD the greatest artistic genius who ever lived. I sat there waiting for him to start laughing, but he kept a completely straight face, and so I did, too.
It was one of the most chilling experiences I’ve had as a teacher: not because Disney Adults are monsters, but because I was seeing something altogether new in the world, the pseudo-religious worship of a corporate brand. (There are in fact scholars who consider Disney Adults to be members of an emergent religion.)
Remember I said different in degree, not in kind. Wall-to-wall consumption within a single corporate entity is so ubiquitous today it’s hard to imagine not doing it, whether you are an Apple Adult (MacBook, iPhone, iPad), an Amazon Prime Whole Foods Adult, a Google Adult, a Meta Adult, or (like me) all of the above. Let’s remember that the largest US publisher, Penguin Random House, was blocked from acquiring Simon & Schuster in 2022, a merger that, by the Justice Department’s estimate, would have given one publisher close to a 50 percent share of the market for anticipated bestsellers.
In the meantime, teenagers and young adults who have grown up immersed in a Disney Adult world love whatever isn’t marketed to them: used records, thrift stores, flip phones, stick-and-poke tattoos, and homemade haircuts; I recently bought a hat at a record store in Lawrence, Kansas, that reads, “Analog is the Future.” If that seems like a feeble form of anti-corporate dissent, consider the potency of analog protest, where the simple act of putting up a cheap tent and handmade sign on a university lawn reverberates around the world as an act of war.
We live in an era in which corporate and pseudo-corporate narratives of all kinds seem unconquerable—but if you poke at them, they roll up into a ball like potato bugs and demand the violent protection of the state. When the same IP has been recycled one too many times, it starts to sound nonsensical, and right now, just as monopoly capitalism seemed to be winning on all fronts, nonsense is breaking out all over.
*
I’ve been thinking about the word “franchise,” as a way of describing not just the extension of a given cultural property (or era) into a potentially infinite series of adaptations, revisions, reboots (the Star Wars franchise, the Marvel franchise, the Mattel franchise) but as a way of describing how contemporary culture wants us all to behave, that is, to franchise ourselves, to become an all-encompassing personal brand.
Ideally, this is a cradle-to-grave process only one step removed from The Truman Show, as demonstrated by celebrities who’ve been in the public eye since childhood—Zendaya, Taylor Swift, Olivia Rodrigo, Harry Styles, Ariana Grande, Selena Gomez, Justin Bieber, the Jenner siblings, the Olsen twins—and who represent a cultural ideal of monetizing their fame across multiple platforms. Viewed from a labor-rights perspective, or a history-of-show-business perspective, these performers are a huge success: they have cornered the market on themselves. Forget the tragic, ruined celebrities of the past, exploited by managers, the Mob, and/or the press: these young stars are fabulously wealthy and running their own show.
But I don’t really care about them: I care about people in my own orbit and cultural sphere, who have been tasked with living the same way for at least 20 years: that is, under the pressure and the expectation that they will mine their own lives for content and become a brand, an influencer, just by being themselves. The imperative gets stronger the younger you are, and the more you’ve lived your life exposed to the Internet. But it affects anyone who engages in any kind of cultural production: At this point it’s so baked into the culture that it’s a given and a joke.
Since the late 2000s—that is, since the rise of the first all-pervading social media platforms—self-franchising has dominated the literary world in two different ways. The first, obviously, is through the emergent genre known as autofiction: Jenny Offill, Rachel Cusk, Teju Cole, Sheila Heti, Chris Kraus, Sally Rooney, and Karl Ove Knausgaard, among others, who have marketed a sensibility, a lifestyle, a vibe, directly through their writing, which is famously not about stories, characters, or dramatic events, but rather about a fictional stand-in for the author, who has a highly wrought sensibility and a lot of leisure time in which to think about things.
Autofiction is a form of solipsism refined into a high art: as Knausgaard once put it, echoing Margaret Thatcher, “in reality, there is no such thing as the social dimension, only single human beings… if you want to describe reality as it is, for the individual, and there is no way, you have to really go there.” It’s also highly antagonistic toward the novel as a preexisting genre: writers of autofiction view fictional characters and plots as artificial, manipulative, banal, and escapist. Sally Rooney drove this point home in her 2021 novel Beautiful World, Where Are You, when Alice, a young Irish novelist and stand-in for the author, writes the following to her friend Eileen:
The contemporary Euro-American novel… relies for its structural integrity on suppressing the lived experiences of most human beings on earth. To confront the poverty and misery in which millions of people are forced to live, to put the fact of that poverty, that misery, side by side with the lives of the “main characters” of a novel, would be deemed either tasteless or simply artistically unsuccessful. Who can care, in short, what happens to the novel’s protagonists, when it’s happening in the context of the increasingly fast, increasingly brutal exploitation of the majority of the human species? In this world, why does it matter?
Writers of autofiction tend not to spend much time on the Internet, viewing it (wisely) as competition. Other writers do. The more visible form of self-franchising for writers over the last 20 years has been cultivating an Internet presence that augments and amplifies your career and makes you widely known, even among people who haven’t encountered (and may never encounter) your books.
So many writers have done this it’s difficult to keep track, but some of the best-known examples would include Roxane Gay, Emma Straub, Alexander Chee, Kiese Laymon, Viet Thanh Nguyen, Chanda Prescod-Weinstein, Isaac Fitzgerald, Saeed Jones, Brandon Taylor. There are also many writers who had a moment on the Internet and then gave it up. (Colson Whitehead, once upon a time, was a brilliant and scathing user of Twitter.) And then there are some genuinely tragic tales: writers who’ve used their platforms for outright fraud (not naming names, no thank you) and writers who have had spectacular flameouts due to addiction and mental illness.
Understanding these two forms or genres of writing as varieties of the same thing, operating under the same impulse, has helped me pinpoint the most striking feature of autofiction: that it wants to be the Internet without the Internet. Autofiction preserves the same qualities as social media—dailyness, ephemerality, immediacy, intimacy, ambivalence, inconclusiveness—but only as a form of what Bakhtin called monoglossia, language in which only one person speaks. At its best, it has a meditative, essayistic quality—people will always point to W.G. Sebald as the classic exemplar of this style—but I’ll be brutally honest and say I’d give up Sebald any day in favor of Susan Orlean’s 2020 drunken Twitter thread, in which the responses were half the fun.
To me, in other words, this era of self-franchising has been much more fun to witness on the Internet than on the page; autofiction has always been an unsatisfying, self-serious substitute for the anarchic characters you could encounter for free, in real time, in interactive form. The paradigmatic example in my mind is Brandon Taylor, who became well-known for his argumentative, self-dramatizing voice on Twitter years before publishing his first—highly autofictional—novel Real Life, which I found stiff and almost impersonal by comparison. While reading it I experienced a strange sense that something was missing: the sensibility of the “real,” online Brandon.
I use the past tense here because in the literary world the epoch of self-franchising seems to be coming—lurching, or dwindling—to a close. Literary trends always have a sell-by date, and autofiction reached it around 2020, becoming a) a recognizable subgenre writers can argue about (as in, “She calls her new book autofiction but it’s really just memoir”) b) self-critical, and/or c) a parody of itself. I felt the ground shifting in 2021 when Taylor responded to Beautiful World, Where Are You by rolling his eyes, in the New York Times Book Review: “It [feels] as though we’ve reached a point in our culture,” he wrote, “where the pinnacle of moral rigor in the novel form is an overwhelmed white woman in a major urban center sighing and having a thought about the warming planet or the existence of refugees.”
Over on the (actual) literary Internet, it’s been nothing but frustration, disillusionment, and disengagement for years—first with waves of writers exiting Facebook, then the-platform-formerly-known-as-Twitter, and finally, in plenty of cases, social media altogether. Some writers, it’s true, have tried migrating to newer platforms (Bluesky, Mastodon, Threads) or reluctantly returned to the old ones with a mixture of disgust, hand-wringing, and resignation. But the vibes are off. In a recent piece for Esquire, “Why Are Debut Novels Failing to Launch?”, Kate Dwyer (paraphrasing Kyle Chayka) identifies the point of absurdity social media marketing for writers has reached, when most of the content has nothing to do with books at all:
Connecting an artist’s biography to their art—or, by another name, creating a parasocial relationship between readers and authors—has long been an effective marketing strategy. But for debut authors, it now goes beyond writing personal essays and includes becoming a bona fide social-media influencer…These days, “in order to get exposure, you have to make the kinds of content that the platform is prioritizing in a given moment,” Chayka says. On Instagram, that means posting videos. Gone are the days of the tastefully cluttered tableaux of notebooks, pens, and coffee mugs near a book jacket; front-facing videos are currently capturing the most eyeballs. “A nonfiction author at least has the subject matter to talk about,” Chayka says. (Many nonfiction writers now create bite-size videos distilling the ideas of their books, with the goal of becoming thought leaders.) But instead of talking about their books, novelists share unboxing videos when they receive their advance copies, or lifestyle videos about their writing routines, neither of which convey their voice on the page. Making this “content” takes time away from writing, Chayka says: “You’re glamorizing your writer’s residency; you’re not talking about the work itself necessarily.”
As Dwyer acknowledges, this partly has to do with the decline of social media and the big-platform Internet itself, which has been happening in front of our eyes for many years, and the correspondingly profound shift in the way our culture views technology, from the blazing techno-optimism of the 1990s and early 2000s to the Internet’s current favorite word for itself, Cory Doctorow’s coinage, “enshittification.”
What used to be fun or at least engaging—the way ideas, theories, and trends used to get hashed out online, giving rise to new essays, new books, even new organizations, like the feminist writers’ advocacy group VIDA—has now become routine, counterproductive, and cynical, even to those who still participate in it. The content machine is still as hungry as ever, but less and less value and satisfaction seems to come out of it.
I found the perfect analogy for this in two TV shows my writing students told me to watch this past spring, The Boys and Gen V (Gen V is a spinoff of The Boys, which is based on a comic book series). Both shows are set in the kind of superhero universe you might find in Marvel, DC, or a thousand other variations, but here the superheroes (the small percentage of the public born with superpowers) are primarily interested in marketing themselves. In Gen V, which takes place at an elite academy for young superheroes, only a tiny number of the students are allowed into a program that teaches actual skills (saving cities, battling bad guys); the rest of them are enrolled in influencer school, competing for eyeballs with thousands like them.
Watching Gen V—a show largely, if not entirely, aimed at a Gen Z and Gen Alpha demographic—I felt like I was witnessing the collapse of the franchise model of cultural life, the era of people-as-platforms, the celebritization of everything. These kids know better. Even the ones trapped in a self-promotion economy understand it’s slowly killing them and want a way out, when they’re not immobilized by depression, despair, and self-harm. Psychologically, if not economically, they have seen the machine, know what it can do, and are already on their way to finding an alternative.
This is essentially the thesis of Anna Kornbluh’s new book Immediacy: The Style of Too Late Capitalism, published by Verso earlier this year. The basic logic of cultural production since 2000, Kornbluh says, is that consumers want more stuff, more content, to fill every corner of their lives. Now well into the second decade of the streaming era, we’ve seen whole eras, whole life-cycles, of media come and go: cycles of hope, disappointment, careers built, careers destroyed, in pursuit of the One Big Stream, or maybe better called the One Big Mind, which reads, watches, listens to everything, a bottomless well of curiosity and desire and anxiety into which an infinite stream of image/text/video can be poured.
But the real problem, Kornbluh says, is that we’ve been so eager to embrace the style of immediacy that we’ve come to identify with this form of production itself. Autofiction is fascinated with contemporary Internet-driven self-franchising in the same way the Futurists and Vorticists loved the technological marvels of the early 20th century: telephones, film cameras, offset presses, machine guns. What autofiction is telling us, in Kornbluh’s language, is that our selves are the only thing we have left to sell:
Since the last quarter of the twentieth century, economic restructuring in the Anglophone world… has sculpted every aspect of social life to the model of market competition, blazing new frontiers in reconceiving human being as human capital. This has involved channeling wealth upward by defunding public institutions, slashing corporate tax rates, responsibilization—making individuals and families bear the costs of social reproduction like education and health care—and commodifying the commons and the body… These are some of the many elements David Harvey, Silvia Federici, Ashley Dawson, and others deem “the new enclosures”… Enclosure is inscribed as literature in gestures of privatization and prosaicism… [Autofiction] encloses and reifies the substance of real bodies and real identities, excluding the less phenomenal, more speculative processes of fictionality and figuration.
It’s no wonder, then, that the purpose-built entertainers of this era have been child stars who have never not been performers, who have a catalog of consumer goodwill going back to their pre-adolescent years, and who have been in the spotlight so long that even relatively casual fans know way too much about them. Child stars are the avatars of the personal brand and the 21st-century entrepreneur because, unlike earlier generations of child performers, they’re not objectified products; they’re agents of their own destiny, who often (sometimes spectacularly) take control of their own marketing and merchandising as young corporate titans. Their enclosures, their platforms, their brands, their fan bases, are huge.
In an essay I posted last summer about consensus culture and the fame economy, I wrote that in the literary world, where we don’t have former child stars selling out arenas and causing earthquakes, relying on self-promotion and the language of fame has brought about diminishing returns and made a lot of writers miserable. Yes, autofiction in one guise or another has been around since the dawn of literary time, and yes, as Kate Dwyer says, writers have had parasocial relations with their fans going back to the origins of modern media, but what’s different about the last 20 years is the compulsory nature of self-franchising in alliance with our particular phase of capitalism: writer as persona, persona as brand. This is a problem in the lives of artists and the arts economy but it’s an even bigger failing, a more basic failing, in our understanding of what art is.
Along these lines, I was struck by Sheila Heti’s note of admiration in her recent eulogy for Alice Munro in the New York Times, titled “I Don’t Write Like Alice Munro, But I Want to Live Like Her”:
Fiction writers are people, supposedly, who have things to say; they must, because they are so good with words. So people are always asking them: Can you say something about this or about this? But the art of hearing the voice of a fictional person or sensing a fictional world or working for years on some unfathomable creation is, in fact, the opposite of saying something with the opinionated and knowledgeable part of one’s mind. It is rather the humble craft of putting your opinions and ego aside and letting something be said through you.
Alice Munro, it’s true, almost never appeared in public, published very little nonfiction, and communicated with the world almost entirely through her work: the opposite of a self-franchising writer. Humility, no doubt, was always part of her appeal. It’s interesting and novel to hear a writer of autofiction recognize that “a fictional person” or “a fictional world” is a valid artistic construct—how times have changed! But what Heti gets wrong—perhaps can’t fathom—about Alice Munro is that “Alice Munro” is not a lifestyle brand. Humility and self-effacement, while potentially valuable as strategies for artistic survival, aren’t the point: the point is to make something, to assert something, that has never existed before.
Fiction is about alterity, that is, otherness. The word comes from the Latin fingere, “to form, to contrive.” To think of fiction as an egocentric appropriation of the real (as Chris Kraus describes it in I Love Dick) is to look at it exactly backwards. As I’ve written before: fiction uses the raw materials of the real to create something outside it. Which is why in my view there’s nothing wrong with autofiction or autobiographical writing of any kind: just with literal, narrativized, self-limited, stylized autobiography that proposes itself as its own limit.
Let me draw out this idea for a moment with another foundational piece of Gen X music, Sonic Youth’s “Schizophrenia,” the opening song of their 1987 album Sister. “Schizophrenia” opens with a thudding 4/4 beat on muffled tom-tom drums, and then a very strange sound on the guitar: not even a chord, technically, just a G with a Gb, its major seventh, played in an alternate tuning on unison strings. G and Gb are only a semitone apart, but spread nearly an octave, as they are here, they create a wide, weird-sounding interval. (You can reproduce the effect in standard tuning by playing an open G string and a Gb on the B string below it, at the seventh fret.)
What’s so important about these opening notes? Sonic Youth, which at that time had a very small audience of New York downtown music aficionados, played all their songs on manipulated guitars. As Lee Ranaldo, one of Sonic Youth’s two guitarists, wrote recently:
In the early days we had mostly cheap guitars that wouldn’t stay in tune, so we were more prone to using them as noisemakers—either roughly tuned or “detuned,” sticks and screwdrivers under the strings, etc (only citing John Cage’s preparations much later lol) and from that point on… we gravitated away from standard tunings completely. The main advantage of this, for us at that time, was: a) we liberated ourselves from standard chord patterns and structures—indeed, liberated ourselves from even knowing what key we were playing in oftentimes. We had to invent our chords, which forced us to LISTEN more rather than rely on a codified system of chords and lead patterns that every guitar player knows, and b) it caused our music to sound radically different to that of every other rock band at the time, who were all playing standard chords in standard tuning, further setting our music apart.
To me—as much now as when I first heard it at age thirteen—“Schizophrenia” is beautiful because it’s so profoundly lonely. From a child’s perspective—at least from this child’s perspective—the late 1970s and 80s were that, a time of deep abandonment, alienation, and hollowness. Something was deeply, pervasively, pathologically wrong. I didn’t need to know a thing about “Sonic Youth” to understand that. (Decades later, through a mutual friend, I met two members of the band, who turned out to be friendly, unremarkable people.)
As adults we get to have fewer and fewer of these startling, no-context, unfiltered experiences of encountering art, so it’s easy to forget that this, ideally, is how art should be encountered: not through association or recommendation or vibes, but out of nowhere, with full force, when you’re not prepared, and you have no idea who or where it came from.
In Immediacy, Anna Kornbluh argues for a return to what she calls “mediation,” an all-encompassing term that for her (as far as I can tell) means any kind of pre-existing form, like the conventions of the imaginative novel, which keep getting reinvented by writers like Colson Whitehead and Justin Torres (Blackouts, if you haven’t read it, is a fantastic example of a novel that detaches the reader from present-day reality and enters a realm of dreamlike uncertainty and unresolvable longing).
I’m all for that, but I wonder if she’s missing the real point of her critique, which is that artists (like everyone else) would do well to start thinking about the values of privacy and opacity, or what in European Internet law is called “the right to be forgotten.” I don’t mean shutting off all of our social media accounts and moving to (what’s left of) Greenland; I mean thinking very carefully about the extent to which we confuse our personhood and our work. The statement “if your books aren’t successful, it’s because you failed to be an interesting public personality” is only a hairsbreadth away from the statement “if your books aren’t successful, it’s because you’re not a worthwhile person.” Both imply the same thing: it’s really you, not your art, that’s being sold.
A much healthier attitude, to paraphrase Fugazi, is to look at the reader (or “the public,” if you prefer) and say, I don’t owe you anything. Just the book. Just the words. A work of art should be its own explanation.