Earlier this year, William Gibson unleashed his latest novel, The Peripheral. For many, the author's return to the future, after 14 years spent writing about the present, was welcome. For Gibson, however, the future looked as bleak as he'd left it in the 1990s.
Once upon a time (the mid-'80s, to be exact), Gibson was the face of science fiction. His dystopian works warned of a near-future where computer technology was woven into our DNA and a virtual datasphere dominated human interaction. The genre was called cyberpunk.
In the pre-internet days, cyberpunk titillated readers with its underworld of hackers, anarchists and punks hell-bent on disrupting an autocracy of anonymous oppressors. Others, like Bruce Sterling and Neal Stephenson, followed Gibson's lead, releasing tomes that had critics and academics taking sci-fi seriously for the first time. Thirty years after they party-crashed the literary scene—with the internet now in roughly 75 percent of American homes, 40 percent worldwide—cyberpunk is largely forgotten. Many of its predictions, however, quietly came to pass.
Gibson coined the term "cyberspace" in a 1982 short story titled "Burning Chrome." His landmark debut novel, Neuromancer (1984), further conceptualized the virtual network that Gibson described as "a consensual hallucination experienced daily by billions of legitimate operators in every nation." Author Jack Womack said Neuromancer was less about predicting the future and more about affecting its lexicon. Though Gibson once saw such visions as "fantasies of anxiety," today, like everyone else, he's an avid internet user. In just the past month, he's tweeted nearly a thousand times.
Born near Myrtle Beach, S.C., in 1948, Gibson avoided the Vietnam War draft in 1967 by hopping a bus to Canada—first Toronto, eventually Vancouver—where he had his first brushes with the counterculture, hallucinogenic drugs and the work of William S. Burroughs. By the late '70s, Gibson had fallen under the spell of punk rock and science fiction, which he called a "derelict, but viable form."
At a 1981 sci-fi convention in Denver, he met Sterling and another budding writer named Lewis Shiner. The trio appeared at Austin's ArmadilloCon a year later, where they gave a panel on punk in science fiction. Shiner later noted the "movement solidified" there; Sterling dubbed Gibson's "Burning Chrome" "a classic one-two combination of lowlife and high-tech." Neuromancer dropped two years later, becoming the first book to win the Nebula, Hugo and Philip K. Dick awards—science fiction's top honors.
Critic Lawrence Person called it the "archetypal cyberpunk work." Time listed it among its 100 best English-language novels since 1923. The Guardian called Gibson "the most important novelist of the past two decades."
Next came Count Zero (1986) and Mona Lisa Overdrive (1988), each set, like Neuromancer, in a megacity called “The Sprawl,” where sky and weather are machine-controlled. An advanced computer network—dubbed the “matrix”—is available to all inhabitants, who spend each waking moment there.
"Johnny Mnemonic," a Gibson short from '81 also set inside the Sprawl, was adapted for the big screen in '95. Starring Keanu Reeves as a cyber-trafficker who'd undergone surgery to install a data system in his head, the film was a critical and commercial flop. Similarities in cinematography and tone, however, were later detected in the 1999 megahit The Matrix (also starring Reeves), which Gibson called "the ultimate cyberpunk artifact."
The Matrix also borrowed narrative elements from Gibson's work. Laney, a character in the author's Idoru (1996), for instance, looks for patterns in the flow of data; protagonists in Count Zero have skills (kung-fu, helicopter piloting) downloaded to their brains; and Neuromancer features artificial intelligences trying to free themselves from human control.
His influence quickly spread to music, where the first notably cyberpunk album—Sigue Sigue Sputnik's Flaunt It (1986)—unleashed a sinister collage of pop slogans and gothic-futurist soundscapes into the synthpop ferment. In the Gibson documentary No Maps for These Territories (2000), the author describes Neuromancer as "not a goth book, but kind of the same world that makes kids be goths."
One such kid was Trent Reznor, veteran of middling Midwest synth acts like Slam Bamboo and Exotic Birds. In 1989, he embarked on a solo project under the moniker Nine Inch Nails. The resulting Pretty Hate Machine, for all intents and purposes, ended the shiny '80s sound for good. Before it became a landmark of alternative music history, Pretty Hate Machine was strictly cult. Critics bashed it. Yet a devoted core found in its pounding drum machines, aggressive synth textures and tortured human voicings a last connection in an otherwise depraved world.
Other Gibson-inflected works soon followed. Punk-turned-pop-star Billy Idol's '93 comeback attempt—titled Cyberpunk—was rife with overly stylized technophobic anthems that went nowhere. UK dance-rockers Jesus Jones anointed their album Perverse (also '93) the "first made entirely on computers." (Doris Norton's Personal Computer album of 1984 had it beat, but who's counting?) If less than a masterpiece, Perverse pointed to the tech-driven DIY of a decade hence, when albums would be recorded top-to-bottom on a computer, in a bedroom.
The biggest purveyors of spatialized, abstract sound in the oncoming swarm of internet culture, however, came from electronic dance music (EDM). Nascent '90s strains of acid house, jungle and drum'n'bass employed fractured samples and undulating breakbeats to convey what Gibson called the "incomprehensible present."
Richard D. James (aka Aphex Twin) was an Irish-born, Cornwall-raised artist whose recordings seemed less for rave parties and more for cerebral, alienated geeks sitting alone in a bedroom, contemplating the new open platform of real cyberspace, now dubbed the World Wide Web.
Aphex’s Xylem Tube EP of ‘92 is one of the most unusual releases of the period. “Polynomial-C” begins ambient, but quickly goes dissonant, its stacked arpeggios a mind-trip that both exhilarates and traumatizes. “Tamphex” loops a sample from a tampon ad that feels hard-wired into our brains.
James’s only true peer in this style was Autechre, a Manchester duo equally interested in things that don’t exist. Autechre’s name, like many of its song titles, is a made-up word referring, essentially, to itself. Similarly, Gibson, in Memory Palace (1992), asserts: “We’ve always been on our way to this new place that is no place really.”
Never one to be left in the cold, rock chameleon David Bowie responded to the tech explosion with Outside (1995), featuring Trent Reznor on a remix of the album's "The Heart's Filthy Lesson." Bowie—along with Outside's producer Brian Eno—had in fact already brushed up against one of the pioneers of cybernetics.
Anthony Stafford Beer was a British management cybernetician commissioned in 1971 by Salvador Allende's new socialist government in Chile to create a computer network (dubbed "Cybersyn") of some 500 telex machines reporting variables in workforce conditions. Though the system never fully worked (and a coup d'état in '73 overthrew Allende), Beer gained a passionate follower in Eno. The two men struck up a correspondence in '75, and soon Eno collaborators like Talking Heads' David Byrne and Bowie became acolytes too. (Bowie put Beer's Brain of the Firm on his list of favorite books.)
Beer, in a 1964 lecture, spoke of the arrival one day of a smart network of connected devices—a so-called Internet of Things. Little did he know that the Pentagon's Advanced Research Projects Agency (ARPA, later DARPA) had begun sketching such a project as early as '63. By connecting the mainframes of research universities, the original ARPANET sought—at least in the popular telling—to gather and protect information inside a virtual, computerized space in the event of a nuclear war. The internet, as it emerged in the 1990s, is its side effect.
As Gibson's star reached its pop-culture apex, the very Cold War that had brought about the internet's creation was ending. The Berlin Wall fell in 1989; the Soviet Union collapsed two years later. U2 arrived in Berlin in 1990, on the eve of German reunification, to reinvent themselves under the guiding hand of, you guessed it, Brian Eno. They would become the rock band most closely associated with Gibson.
The author was tapped to appear in a televised documentary of U2's ZooTV tour, supporting Achtung Baby, their 1991 reinvention. (William Burroughs also appeared.) The band returned the favor by contributing incidental music to the audiobook version of Neuromancer in '94. Gibson also interviewed the band for Details magazine, wherein lead singer Bono reflected on fin-de-siècle celebrity: "At first, when you're reading stories about your life in the media…you feel violated. Then you start to realize that the person they're describing has very little to do with you and is in fact much more interesting than you are."
The pervasiveness of multimedia became an obsession on U2's '93 album Zooropa, which opens with the phrase "Vorsprung durch Technik" (a '90s Audi ad slogan roughly meaning "advancement through technology"). The technophobic LP ends 10 songs later with "The Wanderer," starring country legend Johnny Cash as guest lead. The Man in Black's bellow of lines like "drifting through capitals of tin" hovers ominously over burbling synths, warning of rushing too quickly into the virtual—something he knows cannot be stopped.
By '95, Eno had helped U2 disappear into a side project billed as "Passengers," whose 14 mostly imaginary film soundtracks turned the world's biggest rock act into its own virtual reality. As cyberpunk became the syntax of its age, nothing was sacred. Even Superman, that comic-book champion of the All-American Way, was suddenly (and without warning) slapped with a cyberpunk makeover.
After the hero's surprising death in a 1993 issue of the DC staple, Superman's writers scrambled for several years to continue the storyline. Then, in '97, came the idea to reboot the beloved superhero as a being of uncontrollable energy, soon split in two—Superman Red and Superman Blue. The techno-fied icon had to figure out how to unite his zig-zagging energies to regain his powers and save the universe. It seemed a valid idea at the time, though by '98 no one quite knew how to work it, and with fan interest waning, DC dropped the cyberpunk Superman altogether, returning him to his traditional tights, cape and boots.
By this time, cyberpunk had pretty much faded anyway. Gibson's newest novels were set in the present—a present he'd been so instrumental in projecting, at least aesthetically. The internet of the year 2000 had made a smooth transition from technological fear into everyday utility. Dystopianism and Y2K hysteria gave way to a benign internet, embodied in things like the Meg Ryan–Tom Hanks romantic comedy You've Got Mail (named for AOL's signature email greeting). A notable exception in the cyberpunk fadeout was Radiohead's OK Computer (1997).
The Oxfordshire band, who'd hit in '93 with the alt-rock pastiche "Creep," had revamped their sound on 1995's full-length The Bends. With lamentations on plasticity and prosthetics, The Bends hinted that our days living separate from machines were numbered. By OK Computer, humans and machines were fully hybridized.
Yet where Gibson’s hard-boiled urchins of the techno-underworld sought a way out of the mind-control, Radiohead’s self-loathing slackers on OK Computer are numbed to the point of regression. Police oversee karma, airbags routinely save lives, and slogans like “God loves his children” hang over a populace of “paranoid androids.”
"Stay away from the future/Don't tell God your plans/It's all deranged/No control," sang Bowie two years earlier on "No Control," from Outside. U2's "Zooropa," too, offered axiomatic simplicities like "Be all that you can be," "Eat to get slimmer" and "Fly the friendly skies." "Numb," from the same LP, distills the trend, with guitarist The Edge murmuring, "Don't move/Don't talk out of time/Don't think/Don't worry/Everything's just fine."
For OK Computer's "Fitter Happier," a computerized voice spits out dictums like "Comfortable/Not drinking too much/Regular exercise at the gym…A pig in a cage/On antibiotics"—lines that could have been lifted from Idoru, where Gibson writes: "Viciously lazy, profoundly ignorant, perpetually hungry…lives by itself, in the dark, in a double-wide, on the outskirts of Topeka." The sensation was no longer one of a future world where computers dominate. It was the present.
Radiohead's 2000 album, Kid A, completed the transformation from mopey guitar-rockers to techno-rock avatars. Where U2's disappearing act with Passengers called for a name change, Radiohead's Thom Yorke simply became the voice inside the machine. (He credited Aphex Twin as an inspiration for Kid A.)
By this time, U2 had already turned their backs on techno-rock experimentation. Where Kid A baffled critics and fans alike, 2000's All That You Can't Leave Behind was hailed as U2's return to form. Bono became the love-him-or-hate-him mouthpiece of pop activism, lobbying government leaders and mega-corporation CEOs to fund AIDS relief in Africa. Both acts, however, fell silent in the immediate aftermath of the September 11, 2001 terrorist attacks. Gibson did not.
He called 9/11 an event "outside of culture." His novel Pattern Recognition (2003) is possibly the first piece of post-9/11 fiction—and, scholar Chris Vanderwees argues, among the first to engage conspiracy theories about the event. (Gibson's main character surfs the internet for footage and opinions suggestive of a unified narrative, finding none.) The book's reading of 9/11, however, went largely unexamined at the time.
As the U.S. (and Britain) waged war in Iraq and Afghanistan, Americans at home grew accustomed to high alerts of new domestic terror threats. A quieter war was also being waged on the public's privacy. Between 2000 and 2001, over a dozen internet privacy bills were introduced in Congress. In the aftermath of 9/11, all of them were abandoned, and in October '01 the Patriot Act passed, greatly expanding the government's ability to surveil its citizenry.
In 2002, DARPA (the internet's original creators) opened the Information Awareness Office, intended to scan and collect every piece of personal data crossing the web—the assertion being that, with enough data collected, the government could predict who might commit crimes. Public outrage ensued, and the program was quickly shut down. Nothing, however, changed—except that businesses such as AT&T, Yahoo and Google now made the same information available to covert law enforcement operations via their user agreements. 9/11, in essence, provided a license for governments to develop spying systems that affect us all.
Then, in 2010, a U.S. Army intelligence analyst stationed in Iraq—Bradley Manning—obtained gun-camera video of a 2007 US Apache helicopter attack on a crowd of unarmed Iraqi civilians (two children in a van were among the casualties), footage WikiLeaks would release under the title "Collateral Murder." Manning burned the video (and other classified military files) onto disc, telling suspicious onlookers he was just listening to Lady Gaga. The material—the video, plus 400,000-plus internal memos (not all from Manning)—told a story of vast war crimes by the American military in Iraq and Afghanistan, as well as plans to spy on countries throughout the world. Flowing from an internet site called WikiLeaks, it galvanized the anti-war movement at home and abroad.
The face of WikiLeaks—Julian Assange—became the first major hacker-celebrity of political activism. Accused of rape in Sweden in late 2010, Assange took refuge in the Ecuadorian embassy in London, which granted him political asylum in 2012. Manning was convicted in 2013 on 17 counts, including espionage and theft, and sentenced to 35 years in prison.
That June, a former CIA tech assistant named Edward Snowden leaked thousands of internal documents pointing to the National Security Agency's mass surveillance of US citizens through telecommunications and the internet. Former NSA analyst Russell Tice, in the days after the Bush administration exited the White House, had already admitted the agency had access to, in his words, "everything." Bush's successor—Barack Obama—had defended the necessity of wiretapping and surveillance on the campaign trail in '08. The programs not only continued after he took office; they expanded.
The blame could be laid entirely at the feet of the government, were it not that much of the legislation eroding personal privacy comes directly from lobbyists funded by the tech companies—the same ones selling us our phones and computers, as well as the platforms where we enjoy music, film, literature and more.
Google, Twitter and Facebook, lauded for broadening the scope of human potential, in fact built algorithms that drive us toward predictable results. Cookies store information on individual user preferences. They have, in essence, created business models that are a dream come true for the CIAs, FBIs and NSAs of the world.
Facebook has well over a billion users, with reams of personal data on each one—proof that plenty of individuals will hand over private information to get something free and fun. Simply put: we've allowed ourselves to be smitten. The computer is now miniaturized, or, as Bruce Sterling predicted, "adorable." Christopher Shin, an engineer at Cellebrite, whose devices help the U.S. government collect information from cellular users, contends that the iPhone holds more personal information than any other device on the market.
French psychoanalyst Jacques Lacan wrote extensively in the 1950s and '60s of the "gaze," which he saw as a projection of power—not of a real person who wishes malevolently to deprive us of our independence, but the result of a pervasive struggle for self-mastery. Post-structuralist Michel Foucault took it a step further in the 1970s, linking the gaze to forms of surveillance: for Foucault, seeing while being watched may liberate some marginal elements, but it shatters sovereignty. We literally become blind to reality. Facebook creator Mark Zuckerberg, meanwhile, has argued that sharing personal information on a new platform like the internet is simply a normal evolution of social behavior.
In the end, many of the same artists who warned of the dangers in taking the full technological plunge are now willing accomplices.
In 2007–08, both Radiohead and Nine Inch Nails, leery of file-sharing in the past, released their newest albums as free (or pay-what-you-want) downloads. Radiohead's Thom Yorke later decried the loss of artists' profits to streaming platforms like Spotify, but the band has yet to pull its music from them.
Last year, U2 offered its newest single, "Invisible," for free, with Bank of America promising to donate $1 to (RED)—Bono's foundation to fight AIDS—for each download. This fall, the band went one further, uploading its newest album to every iTunes account without users having asked for it. All three acts have done soundtrack work for Hollywood movies and Broadway musicals.
NIN's Trent Reznor scored the Aaron Sorkin/David Fincher film The Social Network, a biopic of Facebook's Zuckerberg, who is portrayed as a self-serving pseudo-intellectual fighting claims that he stole the platform's original idea from a pair of jocular lugheads—to say nothing of his part in co-opting user privacy across the globe. The resolution of the Matrix trilogy (the epically jump-the-shark The Matrix Revolutions) has Keanu Reeves' messianic Neo making peace with the machines. It seemed the mantra of our age.
Midway through Radiohead’s “Let Down,” from ‘97’s OK Computer, the languid psychedelic ballad is suddenly bolstered by a flutter of computer blips and bleeps, which crescendo into a soaring Yorke vocal, where, after feeling “crushed like a bug in the ground,” the singer croons of one day growing wings.
If one of the jobs of the artist is to watch those who watch us—to monitor the ways our liberties are encroached upon—it is hard not to think that, on some level, they've flown the coop. If we stop to ask how we got here, we may look back and find the signs embedded in the cyberpunk literature of 20 to 30 years prior, and wonder how we might better have heeded its warnings. But it is too late. Privacy, under the current paradigm, is essentially dead. How our media—and, in particular, our artists—uphold that paradigm going forward remains an open question.