It’s such a simple question Rachael (Sean Young) asks Rick Deckard (Harrison Ford) in Ridley Scott’s 1982 film Blade Runner: “Have you ever retired a human by mistake?” They’ve just met in Eldon Tyrell’s opulent offices, and Deckard, a replicant bounty hunter, has come to interview Rachael as a means of testing the LAPD’s replicant-detecting Voight-Kampff device. Deckard’s equally simple response— “no”—comes without hesitation; he nonchalantly shrugs it off as though he’s never bothered questioning the supposed difference between humans and the androids he’s contracted to kill. The entire exchange takes about five seconds, yet it encapsulates everything that has fueled the public’s decades-long love affair with Blade Runner’s existential dread: What are humans? What myths do they take for granted? What have they been missing?

Over the past 35 years, Blade Runner has (rightly) been lauded for its artistic legacy and chillingly prescient vision. In that time it has also often (rightly) been critiqued for its flaws when it comes to its representations of gender and race. Scott’s film is full of female characters who are all replicants, yet their literal objectification is barely explored; East Asian aesthetics pervade its vision of dystopian LA, yet Asian characters are largely background players; its cyborgs are meant to be stand-ins for oppressed minority groups, but few, if any, minorities are actually present on screen. These shortcomings have become so apparent in the decades since the film's release that Blade Runner has become a shorthand for exploring those topics, even if only to show how sci-fi stories like it can succeed or fail at addressing them. So when word of a sequel arose, the question immediately became whether or not it would update its view of humanity along with its view of the future. The answer to that question, unfortunately, is: not so much.

(Spoiler alert: Minor spoilers for Blade Runner 2049 follow.)

Director Denis Villeneuve’s Blade Runner 2049 is certainly as obsessed with the eroding distinction between human and artificial life as Scott’s film was. Its production design—in 2049, Los Angeles is so overpopulated it looks more like a server farm than an actual human habitat—is just as breathtaking as the original’s. And its technological advances, like the tiny hover-pods that allow Tyrell’s successor Niander Wallace (Jared Leto) to see or the new biologically engineered replicants grown in Matrix-like cocoons, are executed in a way that propels the franchise 30 years into the future. Yet for all the attention paid to updating the sequel’s physical details, its three-hour plot does little to concern itself with anything beyond the depths of its white male protagonists, reducing white women to tired archetypes and utterly sidelining nonwhite characters.

Ford’s reprised Deckard and Ryan Gosling’s blade runner K both have complex inner lives behind their macho reticence. K, like Deckard, doesn't think critically about his job or the replicants he executes. His demeanor remains a mask for the audience to endlessly consider in long, uncut close-up—until a revelation forces him to question his identity, and his world falls apart repeatedly across his face. Deckard describes the heart-wrenching motivations for his self-exile and the agony that has accompanied it; Leto’s Wallace monologues at length about his megalomaniacal ambitions to play god to a species that can overrun humankind. Each man gets a story, and each story gets an airing.

Despite their unrelentingly pedestrian Psych 101 woes, these three men still manage to take up 95 percent of the emotional frame on screen, leaving little room for the women around them to have their own narratives. There’s manic pixie dream girlfriend Joi (Ana de Armas), whom K has literally purchased, à la Her. K’s boss, Lt. Joshi (Robin Wright), berates him at work and then invites herself over, drinks his alcohol, and comes on to him. Mariette (Mackenzie Davis), the sex worker with a heart of gold, repeatedly comes to K’s aid (in every way you can imagine). Wallace’s servant Luv (Sylvia Hoeks) has the most tangible personality, yet she’s obsessed with pleasing Wallace. Even Sean Young’s Rachael makes a cameo as a plot device for Deckard, embodying the final archetype—the martyred Madonna—of this Ultimate Sexist Megazord. Five female characters, not one of them voicing an ambition or desire that does not pertain to their male counterparts. Just because 2049’s future has females doesn’t mean its future is female.

Yet in a deeply ironic twist, the plot itself hinges entirely on their presence; without women, be they human or replicant, the secret K discovers that sends him on his harrowing mission wouldn’t exist. If Villeneuve and screenwriters Hampton Fancher and Michael Green recognized this, they must have ultimately decided they could accomplish the same goals without having to imbue those critical female characters with the same humanity as their male counterparts. (After all, when a character’s function is more plot point than passion, why bother giving it stage directions?) Several moments almost comment meaningfully on women’s disposability—Wallace’s casual gutting of a newborn female replicant; a giant, naked Joi addressing K blankly from an ad—yet each time, they become sad moments in a man’s narrative, rather than being recognized as tragedies for the women themselves.

Unsurprisingly, the problem worsens when viewed through a racial lens. Gaff (Edward James Olmos, now in a retirement home), a lab tech (played by Wood Harris), and two shady black-market dealers are the only men of color with more than one line; none have identities beyond their use to K. The only visible woman of color is one of Mariette’s nameless coworkers. (While de Armas was born in Cuba, her grandparents are European.) The third act finally delivers a plot twist that insists the story is not actually about K and Deckard—except the action continues to focus on them anyway.

As critic Angelica Jade Bastién recently noted at Vulture, mainstream dystopian sci-fi has always been obsessed with oppression narratives. While it returns over and over again to the downtrodden-rises-up-against-the-subjugator model, the genre has always had a remarkable ability to overlook the persecuted groups—people of color, women, the LGBTQ community, people with disabilities—whose experiences it mines for drama. White creators, men in particular, tend instead to whitewash their casts, imagining themselves as both villain and hero. Rather than simply putting the real thing in the story, they turn their tales into metaphors for it. Blade Runner 2049 falls into this trap: Even as Wallace grandstands about "great societies" being "built on the backs of a disposable workforce," everyone the movie deems powerful or worth exploring is still white and almost 100 percent male, relegating those disposable workforces’ descendants to the story’s incidental margins.

This was one of countless missed opportunities Blade Runner 2049 had to transform the franchise into not just a staggering aesthetic and technological achievement but also an incisive read of 21st century society. In the wake of Mad Max’s recent overhaul—which, while imperfect, managed to redeem many of its predecessors’ flaws—this misstep is especially disappointing. Yet, much as Deckard hurriedly brushed off Rachael’s honest question in 2019, 2049’s filmmakers have attempted to tell a story about personhood in 2017 without actually considering the urgent politics that surround who gets to be a person in 2017. And in an era that begs for this kind of reinvention, its failure flattens its message into one more retirement for the books.

Replicant Required Reading

  • Inside the dark future of Blade Runner 2049
  • WIRED's Blade Runner 2049 review
  • Director Denis Villeneuve remembers seeing the original Blade Runner for the first time

You already get a Star Wars movie every year. Star Trek is coming at you from at least two directions. A good chunk of the Marvel movies are basically space opera. Big-screen fascination with science fiction and fantasy is nothing new—but now you can add the many flavors of TV network to the list, from legacy broadcast to basic cable to streamers. Forget comic books; somehow, SF/F novels have become Hollywood’s hottest IP.

Some highlights of what may be on its way: Amazon is making Neal Stephenson’s Snow Crash and Larry Niven’s Ringworld, and a show based on Terry Pratchett and Neil Gaiman's Good Omens is in production. Universal is doing Hugh Howey’s Sand, Maggie Stiefvater’s Raven series, Kurt Vonnegut Jr.’s Sirens of Titan (with Community genius Dan Harmon at the helm), and Roger Zelazny’s Lord of Light, a.k.a. the movie that the CIA pretended to be making when they rescued American diplomats from Iran—a story that was itself the basis for the movie Argo. Done yet? Nope! HBO is making Nnedi Okorafor’s Who Fears Death? Netflix is making JG Ballard’s Hello America and Daniel Suarez’s Change Agent. And Lionsgate is making Patrick Rothfuss’ Kingkiller Chronicle series.

Phew.

Sure, some of these shows will never make it out of development; others won’t last a full season. Runaway successes like Game of Thrones and The Handmaid’s Tale make it worth the effort; Hollywood is nothing if not willing to keep doing what already worked. Besides, the saga of Westeros is about to wrap up, and then who'll take the reins of fire?

All those various networks and services have offered up plenty of original sci-fi, too, of course. Stranger Things is the obvious standout there. And I’ll do you a solid and point you at Travelers on Netflix, a wickedly clever time travel show from Canada, due for a second season this year.

Some deeper incentives might be behind the raiding of the science-fiction section, though. Familiar intellectual property has two advantages for a TV network. First, it’s already vetted. An editor with experience in science fiction has already made the sign of the IDIC over it and fired it out of a photon torpedo tube. Its characters, its world, and at least the skeleton of its plot live in the fictional universe.

Second, TV makers don't need tea leaves—they can just look at BookScan and get a sense of how big their built-in audience is. That’s never certain, and the number of people who read Snow Crash, genius though it is, probably isn’t big enough to turn it into a massive hit. But then again, viewership and ratings don’t have the same importance they once did, especially for streaming services. The Amazons, Hulus, and Netflices of the world are more interested in prestige, in must-subscribe programming, and—as a proxy for those other two things—maybe even in Emmys and glowing press coverage. (Quick counterpoint: Syfy’s The Expanse, likewise based on a successful series of books, hasn’t gotten the same attention or audience—which is too bad; it’s cool.)

Plus, genre often ends up being easier to market than "straight" or more literary drama. Take The Man in the High Castle, based on the Philip K. Dick novel. "What if the Nazis won World War II?" Pow. Now try to find a logline like that for, say, Atlanta.

Meanwhile, the larger multiverse of IP sources has become more constrained. The worlds of comic books and big sci-fi franchises are bottled up and decanted among various studios already. (That said, I’m super-psyched to see Greg Rucka and Michael Lark’s Lazarus and Ed Brubaker and Steve Epting’s Velvet comic books, from the creator-owned imprint Image, getting TV development deals, too. Lazarus is political, dystopian sci-fi and Velvet turns a Ms. Moneypenny-like character into a superspy on the run.) Demand for overall television content is high. Demand for genre content is high. The choices are: Either take a risk and make original stuff, or take slightly less of a risk with a known quantity from somewhere. Anywhere.

Movies used to be the only medium that would try to adapt something as ambitious as a book. In sci-fi and fantasy, sometimes that’d give you the Lord of the Rings trilogy (yay!) and sometimes it’d give you The Dark Tower (um). But now premium television makers have realized that it’s easier to unspool the stuff books are best at—details, set pieces, emotional journeys—in six or 10 or 15 hours than in a mere two. TV’s looking for books; books work better on TV. For the kind of people who roll science fiction books on their Kindle like a development agent rolls calls during casting season, this middle is a very good one to be caught in.

Small-Screen Sci-Fi

  • Charlie Jane Anders on how The Expanse is transforming TV.
  • Adam Rogers on CBS' challenge in turning Star Trek into a streaming-exclusive prestige series.
  • Brian Raftery on The Handmaid's Tale transforming Hulu into a prestige-TV heavyweight.

It's Not Just You. TV Has Hit Peak WTF

Note: This story contains spoilers for the current seasons of Twin Peaks: The Return and American Gods.

Diane, I'm lost: For the last week and a half, I've been wrapped in the spastic world of Showtime's Twin Peaks: The Return, the new 18-hour series from gleeful zig-zagger and noted nightmare sommelier David Lynch. Though only four hours in, The Return is already my favorite new series of the year, full of exquisite terrors, concussion-quick humor, and lovingly reintroduced old faces (surely I'm not the only fan who gets near-weepy whenever a Peaks pal like Bobby or the Log Lady wanders back into view after all these years). Here's a show in which Kyle MacLachlan gets to apply his parched comedic skills to, at last count, three different iterations of Dale Cooper; in which an akimbo-limbed tree speaks via a fleshy, bobbing brain-maw-thingee; and in which Michael Cera lisps his way through an impression of a Wild One-era Marlon Brando. Each episode of the new Twin Peaks is a dark hoot of the highest order, and I feel like Mr. Jackpots just for being able to watch them.

I should also mention that, for the most part, I have no idea what is going on with this show. There are a few plot outlines you can pick up with each installment—like all of Lynch's work, The Return has a forward momentum, even if it's not always discernible—but every new sequence in Twin Peaks sires at least a half-dozen questions: What was the space-floating metal contraption that Cooper visited? Who was the oil-slicked, ghost-like figure sitting in that South Dakota jail cell? Will we ever gaze upon Sam and Tracey's mysterious glass box again? (Speaking of which: Rest in Peaks, Sam and Tracey! Unless, of course, you're somehow still alive, which I suppose is very possible at this point!)

These are the kind of queries that might normally send one to the internet, where an ever-teeming team of culture experts—recappers, explainers, best-guessers—stands by every Sunday night at 10:01 pm, ready to tell you What Just Happened. In some ways, this kind of forensics-first approach can be traced back directly to Twin Peaks. In the early '90s, when the first series debuted on ABC, it was an instructive bit of deconstructionism: A network series made to be discussed and scrutinized, with Lynch and co-creator Mark Frost inspiring questions about the show's narrative (Who killed Laura Palmer?) as well as the medium itself (If a TV drama can get away with flashbacks and dream sequences, then what else can you get away with?). The modern web didn't exist when Twin Peaks premiered, but those sorts of thoughtful debates both predicted and informed the way we'd eventually talk about TV online.

But in the case of Twin Peaks: The Return, such well-intentioned hunches feel pointless—and, at least for me, totally joyless. Twitter and Facebook have made it possible for us to lob opinions and theories as soon as the closing credits roll, but they've also made it tough to become giddily, utterly caught in the grasp of an immersive piece of art. Twin Peaks: The Return has that sort of lingering hold, in part because it's such a goofy marvel to behold, but also because the episodes never come to any sort of definitive conclusion. Instead, they simply float away on the back of a Chromatics synth or some Cactus Brothers harmonies, then remain lodged in your subconscious while you await the next Return. Asking "Wait, what's really going on here?" in the middle of this ever-rare reverie feels like the exact opposite of curiosity. After all, the more you know what's going to appear in the big glass box—my lazy box-as-TV analogy, not Lynch's—the less face-chewed you'll be by the results.

Such let's-just-see-where-this-goes immersion is also one of the pleasures of the other current Sunday-night TV fun house: Starz's American Gods. Unlike Twin Peaks, Bryan Fuller and Michael Green's Gods—which depicts a planet-threatening power-struggle between its titular titans—is theoretically knowable, having been adapted from the 2001 novel by Neil Gaiman. For anyone who, like me, finds themselves occasionally confounded by the show's expansive cast, metaphysics-bending laws of nature, and pop-culture riffage, there are countless Gods primers and decoders available online.

Yet it's hard to imagine why anyone would even want to figure out everything that's going on in this gorgeous brain-poker of a show. On Gods, heroes and villains are introduced without hand-holding or hand-wringing; they simply show up and plunge into the action, their greater motivations sometimes not made clear for weeks, if at all. The world of Gods is less surreal than the world of Peaks, but it's just as ravishingly cuckoo: There's a bar-brawling leprechaun and a literally fiery-eyed taxi driver; a morph-happy god, played by Gillian Anderson, who inhabits the form of Lucille Ball, Marilyn Monroe, and Aladdin Sane-era Bowie; and a roof-dwelling watcher who, at one point, plucks the moon out of an impossibly perfect nighttime sky. To get an idea of just how strange this show is, consider that one of American Gods' most normal characters, relatively speaking, is played by Crispin Glover.

And, like Twin Peaks, the stunning-looking Gods casts a small-screen spell that's best left unexamined, in order for it to remain unbroken. The show has a hard-to-pin rhythm—some characters appear only in brief vignettes; others, like Ricky Whittle's Shadow Moon, can take over entire episodes—that allows Green and Fuller to make each episode as narratively elastic as possible. And every scene in Gods is saturated in a daring, beyond-garish visual palette that makes its most out-there moments (like a nighttime desert-sex scene between two fire-consumed, liquid-skinned men) all the more grandly hypnotic. To stray too far away from this version of Gods by wiki-seeking every still-undisclosed detail would be like pinching yourself awake during an especially dazzling dream. For now, the best way to enjoy both Twin Peaks: The Return and American Gods might be to surrender to their utter weirdness, maybe even try to savor it—ideally, with a piece of pie and some damn good coffee.

The question was simple enough: “Where the hell is Kendall Jenner with her Pepsi 12 packs?” As terror enveloped Charlottesville during the “Unite the Right” rally on Saturday, where hundreds of neo-Nazis and Klansmen violently mobilized in a mutinous showing of white pride to challenge the removal of Confederate statues, that was all it took to resonate with tens of thousands. Though cloaked in humor, its meaning cut deep, the mockery of Pepsi’s ill-conceived April TV ad distilling how racial and social harmony in America remains something of a joke.

The reaction was neither unexpected nor the exception—on Twitter, the binding of anguish, cynicism, and satire has become a shared lingua franca in the wake of national torment.

I spent the weekend in Cleveland, mostly offline, celebrating a friend's wedding. What news I did consume, as I attempted to make sense of the Charlottesville protest turned melee, came via Twitter: images of ghoul-faced white men hoisting torches in medieval defiance, of bodies being launched into the air, and of extremists bludgeoning Deandre Harris, a young black man, as he fell to the concrete. There were people who assembled to honor the life of Heather Heyer, the 32-year-old woman killed during the rally’s fallout. Tweets, too, memorialized the names of individuals like Heyer who were victims of recent incidents grounded in gruesome racial contempt.

In spite of Twitter's capacity to rapidly disseminate news like few other digital rostrums, many of its critics (and users) fear that the social hub has largely become a cesspool of snark and nonsense; “everyone is bad” and “never tweet” are frequent rejoinders to the madness that the platform has been known to spur. In the wake of an occurrence like Charlottesville, one of such ferocity—when deep hatred, out-and-out racism, and death foment in disastrous concert—Twitter becomes a conduit for externalized grief. My feed turned into a tempest of trauma and humor, an emotional battleground yoked by what writer Hilton Als once solemnly described as “a reality I didn’t want to know.”

Rampant denial of America’s past—with its history of slavery, redlining, and mass incarceration—led to tongue-in-cheek tweets about shallow white allyship and President Trump’s equivocating “on many sides” statement. In the aftermath of homegrown extremism or tragedy, a new contagion often emerges, built on innocence and patriotism; stock expressions like “I can’t believe this is happening” or “We are better than this” become a handy, if fictitious, narrative. But on Twitter, such sentiments are relentlessly met with a chorus of ridicule and truth.

There’s also the exhaustion many feel at having to address white ignorance in moments of racial conflict.

The jokes, and their surprisingly codified form, are all part of the Twitter cycle—the news and the grief never outweigh or dilute the humor, and vice versa. And for many of Twitter’s marginalized participants, be they black or gay or female, humor functions as a necessary safeguard. “The absurdity of reality—the only way for many people to deal with that is satire and comedy,” says Charisse L’Pree, who teaches media psychology at Syracuse University.

L’Pree believes microblogging platforms like Tumblr and Twitter have become overwhelmed with people’s desire to share, which often escalates in times of crisis. “This is a process by which we cope with tragedy when there is a very visible white power movement happening,” she says of tweets like Jenkins’. But she is careful to clarify: “It’s a pretty standard reaction; we just see the evidence more because of social media. But the frequency of retweets demonstrates that what this person is saying is actually reaching and resonating with people.”

How we’ve learned to shoulder the terror, in the real world and the world we construct online, is perhaps most telling. The footage and first-hand accounts out of Charlottesville have been at once crushing and completely unsurprising, a reflection of an America many have endured before and are likely to endure again. For those geographically removed from the mayhem, Twitter acts as a necessary remedy. It is a coarse statement, but a no less true one: humor soothes.

Early Sunday morning, I came across an image on my timeline. The photograph had been taken a month before, at a similar "save the statue" Klan rally, but it couldn't have felt more timely. A black man in jean shorts sits somewhere in the vicinity of the protest, smoking a cigarette. He looks untroubled, calm. Wholly unbothered. In his hand he holds a sign that reads “Fuck yo statue.” I laughed for a second, thinking of Chappelle’s Show’s watershed sketch about funk musician Rick James, and then I did what I always do. I kept scrolling.

The anatomy of a Little Simz song doesn’t offer itself up easily. On the recent “Good For What,” a whir of tough North London bristle, the rapper born Simbiatu Ajikawo contemplates early successes. “Look at young Simbi in Vogue/Look at young Simbi in Forbes,” she says in the video, merrily skating through the streets of Los Angeles. “Well someone’s gotta do it right/Someone’s gotta open doors.” As British rap has found more footing in US markets over the last handful of years, taking a bigger share in the international mainstream—partially owed to a greater cross-cultural exchange among artists themselves—Simz has come to represent how music can best travel among divergent cultures in our increasingly globalized world.

One argument, familiar to anyone privy to the nativism of Donald Trump and his ilk, contends that globalization actually dilutes local cultures. In popularizing the customs of a given community, the thinking goes, these things in some way lose their truer essence. We've seen that argument play out with Drake, whose ardent obsession with UK culture has granted artists like Jorja Smith, Skepta, and Giggs more mainstream visibility in North America. Simz’s rise is, in part, a rebuke to that thinking. By taking the independent route and creating her own label, she proves that the best conduit can still be the self, even when it’s away from home.

Metaphor or not, “Good For What” finds the young Brit looking back even as she pilots forward, taking solace in the palm-treed environs of Southern California but still every bit the girl who grew up under the metal skies of Islington. The music, all backbone and unflinching emotional lucidity, may have changed locations, but it’s remained unmistakably Simz, a posture that is no mere performance. “I was made for this shit,” she declares, over Astronote’s murky, atmospheric production. And later attests: “Cause this is bigger than you thought/Thought I was finished, let me give you more.”

“More” has never been a problem for Simz. She is a covetous creator, having collaborated with artists like reggae revivalist Chronixx, Rihanna songwriter Bibi Bourelly, and soul experimentalist Iman Omari; toured with the Gorillaz; and dropped 11 projects since 2010 (a blend of albums, mixtapes, and EPs). There’s been no disconnect in Simz’s presence stateside, either. When she initiated one of the several freestyle cyphers at the BET Hip-Hop Awards in mid October, she did so as the only black woman artist hailing from the UK. With a pinch of English cool, the 23-year-old rapper spoke of her adolescence and the tirelessness it took to overcome the likelihood of turning into just another cultural data point. “Who’d thought this would happen/ teachers would tap me funny when I said I’d make it from rapping,” she offered in the minute-long verse.

Though the cypher included brash ascendants like Detroit's Tee Grizzley and Atlanta sing-rap polymath 6lack, Simz held court like a seasoned pro: dynamic and levelheaded, if exceedingly expeditious in her layered delivery. Her authority comes as little surprise to anyone who has followed the young rapper's continued climb; she has been gaining traction in the US since issuing her seductively ruminative E.D.G.E. EP on SoundCloud in 2014 (the breakout track “Devour” has since amassed 3.65 million streams).

Still, the most radical element of Simz’s arsenal may be her grandiosity. A song like “Good For What”—with its puffed-up moxie and tales of shrewd diligence—provides another roadway into her appeal by better refining the many avatars she dons so effortlessly, accentuating the social realities of black women. Simz’s sustained output has also allowed her to be even more elastic in her selfhood. There is a vulnerable intensity alive in her work; it satiates but jars the soul, with lines so ordinary you forget how much power they hold in one’s own life. “My imperfections make me who I plan to be,” she sang on “Doorway + Trust Issues,” from January’s Stillness In Wonderland (the deluxe edition, which features seven new songs including “Good For What,” releases November 4).

The final shot in the video for “Good For What” zeroes in on Little Simz, standing by herself in the middle of a nondescript LA street, the line “Look at me, once again I was made for this shit” looping in the background. The message is unmissable: no matter where she’s at, it’s best we leave the translation up to her.

Let's just get the weird part out of the way: I'm typing these words on an invisible computer. Well, kind of. There's a visible laptop open on the corner of my desk, but only so my (also not invisible) keyboard and mouse can plug into it. But the window containing these actual words I’m writing? That's just hovering in midair, directly in front of my face, equidistant between me and the now-dark, real monitor I usually use at work.

To be honest, though, right now I’m a little more interested in the other window hiding behind this one—the one with last night’s NBA highlights all cued up and ready to help me procrastinate. So I reach out with my hand, grab the work window by its top bar, move it out of the way, and commence watching Victor Oladipo bury the San Antonio Spurs. Even better? Since I’m the only one who can see these windows, my next-desk neighbor doesn’t know exactly what I’m doing. To her (hi, Lauren!), I’m just the idiot sitting there with a space-age visor on, making grabby motions in midair.

This is the vision of “spatial computing,” an infinite workspace made possible by augmented reality. And while my workspace at the moment isn’t quite infinite, it still stretches across a good part of my vision, courtesy of the Meta 2 headset I’m wearing. There’s a window with my email, and another one with Slack, just so I know when it’s time for me to jump in and start editing a different piece. The question is, is the idiot sitting there in his space-age visor able to get all his work done? That’s what I’ve spent the last week trying to figure out. Spoiler alert: he isn’t.

But the experiment also suggests a different, more important question: will the idiot in his visor be able to get all his work done in it someday? That’s the one that has a more hopeful answer.

If virtual reality’s promise was bringing you inside the frame—inside the game, or the movie, or social app, or whatever screen-based world we’ve always experienced at a remove—then AR’s is turning the whole damn world into the frame. The virtual objects you interact with are here now, in your real-life space, existing side-by-side with the non-virtual ones. While at the moment we’re mostly doing that through our phones, we’re on the cusp of a wave of AR headsets that will seek to turn those pocket AR experiences into more persistent ones.

Meta 2 is one of those headsets; at $1,495, it poses an interesting threat to the far more expensive Microsoft HoloLens, as well as the who-knows-when-it’s-coming Magic Leap headset. (Despite the three using differing marketing language—"augmented," "mixed," "holographic"—they all basically do the same thing.) It’s still a developer kit, though Meta employees are quick to tell you that they use theirs every day at work. But while lots of non-employees have gotten a chance to see what the Meta 2 can do in the confines of proctored demonstrations, not many outside the developer community have had the luxury of an extended multi-day visit with the thing. I have. And I’ve got the enduring red forehead mark to show for it.

This isn’t a product review, so I’m not going to take you through the specs of the thing. Here’s what you need to know: Its field of view is significantly larger than the HoloLens’s (which can sometimes leave me bobbing my head around, looking for the sweet spot that lets me see the virtual objects fully), and its lack of a conventional near-eye display—there are twin LCD panels, but their image reflects off the inside of the visor—means that visuals appear far sharper at close range than VR users might be used to. Text is more readable, images clearer. In theory, it’s perfect for the kind of work I do as a writer and editor.

The Meta 2 uses an array of outward-facing sensors and cameras to map your physical surroundings, then uses that map as a backdrop for everything you do in the headset. That means that if you push a window allll the way behind, say, your computer monitor, it should effectively disappear, occluded by the real-world object. The key here is should: like many of the Meta’s most interesting features, it’s inconsistent at best. The mouse pointer would sometimes simply disappear, never to return; until the company pushed a software update, the headset refused to acknowledge my hand if I was wearing a watch; it wasn’t uncommon for the headset to stop tracking me altogether.
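
For readers curious about the mechanics, here is a minimal, purely illustrative sketch of how that kind of occlusion can work in principle, assuming per-pixel depth maps for both the rendered content and the reconstructed room. This is hypothetical Python, not Meta's SDK; the function and array names are my own.

```python
import numpy as np

def composite_with_occlusion(virtual_rgb, virtual_depth, real_depth):
    """Illustrative per-pixel occlusion test for see-through AR (not Meta's actual SDK).

    virtual_rgb:   H x W x 3 array of rendered virtual content
    virtual_depth: H x W array, distance (meters) from the headset to each virtual pixel
    real_depth:    H x W array, distance to the real surface the sensors reconstructed

    A virtual pixel is kept only if it sits closer than the real world at that spot,
    so a window pushed behind a physical monitor simply stops being drawn there.
    """
    visible = virtual_depth < real_depth
    out = virtual_rgb.copy()
    out[~visible] = 0  # on an optical see-through display, "not drawn" means the room shows through
    return out, visible
```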

The headset’s software interface, called Workspace, is a bookshelf of sorts, populated by small bubbles. Each represents a Chrome-based browser window (albeit a minimal rendition, stripped of familiar toolbar design) or a proof-of-concept demo experience—and maybe soon, third-party apps. To launch one, you reach out your hand, close your fist around it, and drag it into free space. (Hand selection was an issue throughout my time with the headset; if I wanted a no-second-takes-necessary experience, I generally opted for a mouse.) There’s a globe, a few anatomical models, a sort of mid-air theremin you can make tones on by plucking it with your fingers, and…not much else. That’s not necessarily a concern; this may look and feel like a consumer product, but its only real purpose is to get people building apps and software for it.

But as a writer and editor who was ostensibly using it to replace his existing setup, I simply didn’t have the tools for the job. Meta’s current browser is based on an outdated version of Chrome, meaning that using Google Drive was out—both for writing and for syncing with any other web-based text editor. The headset allows a full “desktop” view of your computer, but anything you open in that view takes a big hit in clarity; editing in Word, or even in a “real” web browser, wasn’t worth the eyestrain. Did I enjoy having a bunch of windows open, and moving them around on a whim? Of course. Did I like the fact that I could do my work—or not—without prying eyes knowing I was agonizing over yet another sneaker purchase? God, yes. But for day-to-day work, the "pro" column wasn't nearly as populated as the "con."

Every company working in this space rightfully believes in the technology’s promise. Meta even partnered with Nike, Dell, and a company called Ultrahaptics, which uses sound to create tactile sensations (yes, really), to create a vision of the future that makes Magic Leap’s promotional pyrotechnics look like a used-car commercial.

But this isn’t just augmented reality; it’s not reality at all. At least not yet. Certainly, augmented and mixed reality is well-suited to fields like architecture and design; being able to manipulate a virtual object with your hands, while still sitting or standing with colleagues in the real world, could very well revolutionize how some of us do our jobs. But for now, most of AR’s professional promise is just that. Even a diehard Mac user can get used to a Windows machine, but until object manipulation is rock-solid, until the headset is all-day comfortable, and until there’s a suite of creative tools made expressly for AR, rather than just web-based workarounds that may or may not work, it’s simply a fun toy—or at least a shortcut to looking like a weirdo in the office.

In a couple of years' time, though? That's another story. As with VR before it, the AR horse left the barn ages ago; there's so much money flowing into it, so much research flowing into it, that significant improvement is only a matter of time—and not much time at that. So don't take my problems with a developer kit as a doomsday prophecy; think of it like a wish list. And right now, I just wish it could be what I know it will be.

Edgar Wright's Baby Driver begins the way most capers end: three goons pulling off a bank heist, then their getaway driver leaving the cops in his rearview. Unlike most capers, though, the escapade goes down to the pulse-pounding strains of The Jon Spencer Blues Explosion's "Bellbottoms." It’s a whiplash-inducing rush that lays waste to more than a dozen cars and doesn’t let up for the song’s entire 5:16 runtime, each screeching turn and crash perfectly timed to the track's churning rhythm. And it ought to be—Wright’s been plotting it ever since he heard "Bellbottoms" in 1995. “That moment was the closest I’ve ever come to synesthesia,” the writer-director says. “I would listen to that song and start visualizing a car chase.” As a result, he made a movie perfect for a song, instead of finding a song perfect for his movie.

What’s incredible, though, is that when he dreamed up Baby Driver—arguably the first film to make the iPod a central character—most people were still making each other mixtapes. Back when “Bellbottoms” was released, folks couldn’t dance down the street with 3,000 songs at their disposal. Even with the 15 you could cram onto a single side of a cassette, cuing up the right track at the right time was just about impossible. Portable CD players helped deliver music faster and more accurately, but they skipped constantly if you walked with anything more than the slightest spring in your step. iPods and their non-Apple mp3-playing ilk changed all that, allowing hundreds of hours of music to be stuffed in your pocket, ready to be cued up at just the right moment.

“For the first time, which wasn’t the case with the Walkman or the Discman, the iPod meant people could basically start soundtracking their own lives,” says Wright.

That’s what getaway driver Baby does throughout the movie. Looking to fulfill a debt to the crime boss Doc (Kevin Spacey), he times every heist to a specific song. A sufferer of tinnitus, he needs the music to focus on the road—to drown out the hum in his ears and the chaos around him. That leads to some exquisitely crafted car chases, but it also leads to some moments more relatable to folks who don’t know how to execute proper donuts. Like, for example, the time Baby goes on a coffee run set entirely to Bob & Earl’s “Harlem Shuffle.” We've all had that moment, all known instinctively based on the weather, mood, or activity just what song to cue up, then timed every footstep or lane change to said song. For music fans, getting it right feels like high-fiving a million angels. For a director like Wright, making a movie that way is downright genius.

Related Stories

Lots of directors make movies with soundtracks in mind—and often have songs in mind when they develop their films. Quentin Tarantino and Cameron Crowe are both known for this, but Wright took it a step further, timing scenes to the songs he knew he was going to use. Instead of syncing the action to a track as the movie was being edited, he shot it to match the music beat for beat—just like we all do when we time our morning run to Beyoncé’s “Ring the Alarm” (or, you know, whatever is on your sprint mix).

“When life starts to sync up with your soundtrack, it’s a magical moment," Wright says. "If it’s something where you’re walking and it’s cloudy and the sun comes out in time with a bit in the song you feel like you’re omnipotent—so Baby Driver is an entire movie made up of moments like that.”

As a result, Baby Driver feels like a rollicking action movie that just happens to play like the fantasy everyone has experienced when the bassline of their favorite song times perfectly to their footfall—it’s just that when Baby’s foot falls, it’s dropping the pedal to the floor.

The next time you watch George A. Romero’s classic 1968 creepshow Night of the Living Dead, do your best not to look away. It won’t be easy, as the zombie-zeitgeist-defining shocker—filmed in stark black-and-white, and populated with terrifyingly dead-eyed human-hunters—still has the power to unnerve, nearly 50 years after its release. But take a closer look, and you might get a sense of just how much low-budget derring-do and luck was involved in making one of the most epochal horror films of all time. “There’s a copy of the script visible in one of the frames!” Romero told the New York Times last year. “I won’t tell where. It will be a little challenge for fans to spot it.”

Romero, who died Sunday at the age of 77 after a brief bout with lung cancer, directed several smart-schlock joy-rides during his decades-long stint as a director and writer, including 1973’s bio-shock thriller The Crazies, the 1978 blood-sucking drama Martin, and 1982’s lovingly yucky comics adaptation Creepshow. But his long-running career was always better off dead, thanks to a series of socially minded zombie movies that began with Night—which Romero and a bunch of friends shot in rural Pennsylvania at a reported cost of just $114,000. (The production was so commando that, at one point, a member of the film’s production team borrowed his mother’s car to shoot a scene, and wound up smashing the windshield; he repaired the damage before she could find out).

Romero didn’t invent the zombie movie, but he did reanimate it, using cheap-but-effective effects, patient camerawork, and amateur actors to give the movie an almost documentary-like urgency. As a result, Night went on to earn millions at drive-ins and college theaters across the country, making it one of the biggest independent smashes of the century, and a clear influence on everything from 28 Days Later to World War Z to The Walking Dead.

Related Stories

Still, it wasn’t just the movie’s eerily determined flesh-eaters that made Night of the Living Dead a hit; it was the suffocating on- and off-screen mood it captured. Released during one of the most divisive and paranoia-prone years of the ’60s, Night climaxed with a harrowing finale, in which an African-American survivor (played by Duane Jones) survives a bloody night of zombie-fighting—only to be shot dead by a white gunman. It was a savage bit of social commentary snuck into a midnight movie, and though Romero maintained that the movie wasn’t supposed to be a racial allegory, Night nonetheless proved that horror films were an ideal vessel with which to examine the nightmares of our real world. As Get Out writer-director Jordan Peele noted earlier this year, “the way [Night of the Living Dead] handles race is so essential to what makes it great.”

Romero would make five follow-ups to his shambling breakout, and though 1985’s barf-cajoling Day of the Dead and 2005’s Land of the Dead were both gross, groovy B-movie fun, his greatest work was 1978’s Dawn of the Dead, which followed a team of survivors as they hid out in a zombie-infested shopping mall. Dawn’s anti-consumerism message was a hoot, and the movie could have functioned as pure camp alone—except it remains downright terrifying, full of anatomically detailed gore (all hail gross-out king Tom Savini!), believably desperate heroes (and baddies), and hordes of lurching, formerly luxury-seeking zombies whose dead-eyed lust for more, more, more! seemed awfully human. Dawn comes to mind every time I drive by an empty shopping center, or gaze at a dead-mall photo gallery, and it proved that few other filmmakers understood our base desires—and their ruinous effects—quite as well as Romero. "When there's no more room in hell,” a character says in the movie’s most famous quote, “the dead will walk the Earth." Thanks to Romero, we all got to experience that hell from a safe, scared-brainless distance.

Just over a month ago, Fusion reporter Emma Roller did exactly what the far-right internet wanted her to do: she believed 4chan. For months, the online message board had been engaging in an informal propaganda operation, discussing innocuous gestures and symbols as though they were secret signals among white-supremacist groups. The "OK" symbol emerged as a favorite of those gestures, and the ultra-far-right media got in on the joke. So when Roller saw an image of self-described "national security reporter" Mike Cernovich and his colleague Cassandra Fairbanks doing it, she retweeted it—along with the message, "just two people doing a white power hand gesture in the White House."

Now she's facing a defamation lawsuit. While both Cernovich and Fairbanks have been open about intentionally participating in the troll, Fairbanks—at the time an employee of Kremlin-owned site Sputnik, and now working for the Breitbart-alum run Big League Politics—is now suing Roller for falsely claiming that she's a white supremacist. (Roller's tweet has since been deleted.)

If you've spent much time on social media, you've seen this tactic before: someone trying to slip out of their rhetorical bind by claiming that their offending statement had been a joke, and that you're just being hypersensitive. That thing where someone wears irony as a defense, hiding their true motives? What is that?

That, friend, is Poe's Law: On the internet, it's impossible to tell who is joking. In other words, it's the thinking person's ¯\_(ツ)_/¯. But Poe's Law isn't only useful as a defense against pearl-clutching reactions. It's also a diagnosis of exactly how the troll mentality has weakened internet culture. If nobody knows what anyone means, then every denial is plausible.

A degree of ambiguity has always been baked into internet exchanges. "[On] Usenet in the '80s, community guidelines would often indicate that it was hard to tell if someone was being silly or sincere," says Ryan Milner, author of The World Made Meme. But the first person to codify this particular digital phenomenon was a user calling himself "Nathan Poe," on a creationist forum in 2005. During a discussion about perceived flaws in the theory of evolution, people began bemoaning how difficult it was to divine if participants were for real. Then Poe (who has never been identified IRL) posted this axiom: "POE’S LAW: Without a winking smiley or other blatant display of humor, it is utterly impossible to parody a Creationist in such a way that someone won’t mistake for the genuine article."

Since then, the concept has grown far from its creationist origins. "It's a big part of the culture of collective spaces like 4chan or Reddit, where people don’t know each other interpersonally and you can't gauge intention," Milner says. There's even a subreddit called /r/poeslawinaction, where users document and discuss ambiguous internet moments. Take, for example, 4chan's infamous /pol/ (short for "politically incorrect") board, where people routinely post obscene and hateful content. Guess how they justify their actions.

Poe's Law has also spawned pictorial memes of its own.

But because the internet has changed in innumerable ways since 2005, expanding and accelerating all the while, Poe's Law applies to more and more internet interactions. “When social networks used to be bounded by interests, the joke teller could expect that their audience was in on the joke," says Whitney Phillips, author of This Is Why We Can’t Have Nice Things: Mapping the Relationship Between Online Trolling and Internet Culture. "Now a single retweet can cause spontaneous global amplification.” That's a lot of people who don't have context for your in-joke.

Meanwhile, 4chan's reach has continued to grow, either through its own profile—thanks to hand-wringing news stories—or through the emergence of the alt-right and other troll-leaning groups. With Poe's Law serving as a refuge for more and more scoundrels, it has effectively been weaponized, like so many other memes.

To wit: Breitbart tech editor and "provocateur" Milo Yiannopoulos, who clearly understands the escape hatch that ambiguity offers from accusations of prejudice. Yiannopoulos took to Breitbart to make this Poe-etic claim: "Are [the alt-right] actually bigots? No more than death metal devotees in the 80s were actually Satanists. For them, it’s simply a means to fluster their grandparents." He used similar logic to justify actions like leading a racist harassment campaign against actress Leslie Jones, which culminated in his ousting from Twitter. And to plenty of people, that just looked like Twitter not being able to take a joke.

4chan's /b/ board—the original home of 4chan's trolls and their shitposting shock tactics—has an internet-famous boilerplate to the same effect: "The stories and information posted here are artistic works of fiction and falsehood. Only a fool would take anything written here as fact." (So you can imagine 4chan's delight at mainstream journalists like Roller taking their activities seriously despite the disclaimer.)

Poe's Law doesn't end online, either. “People talking about 'spin in the era of Trump' and 'post truth' don’t talk about politics in terms of Poe’s Law," Phillips says. "But it's there, whenever you’re not sure if you should be mad or just roll your eyes.” It's as present in Julian Assange stoking the Seth Rich conspiracy or Kellyanne Conway's "kidding" about telling people to buy Ivanka Trump's clothing as it is in YouTuber PewDiePie's attempts to justify racism as satire.

Just being able to name the phenomenon makes it a little easier to ¯\_(ツ)_/¯ off the little stuff. But it also points to a less apathetic way forward. Once you know it's there, Poe's Law begins to lose its potency—especially as a shield for trolls to hide behind. As Milner says, "When all you have are people’s words, then the words and their effect are all that matters."

Last year the self-professed “communist farmer” and “saxophone kisser” known as Oedipus uploaded a song to SoundCloud, the artist-first music streaming platform that launched in 2008. Titled “Please B Okay,” it was a bright horn-driven melody that sampled vocals from Japanese soul-pop singer Taeko Ohnuki’s 1977 album Sunshower. I first came across the song because I’d been having an atypically unfavorable week and a friend messaged it to me, hoping it would be a momentary cure-all. “This instantly makes me feel better,” she said. Looking back, her remark shouldn’t have been a surprise: Oedipus had tagged the two-minute track as “selfcarecore”—a sure nod to its calming, feel-good properties.

By conventional industry standards, selfcarecore is not an established music genre, but it carries significance just the same. On SoundCloud, genres thrive on amorphism, defined more by a song’s uncompromising sentiment—rage, anxiety, body-rolling euphoria—than by the pulse of the beat or musical composition. (A casual listener might be inclined to label “Please B Okay” as simply house music.) This has given the Berlin-based platform a unique advantage not just in breaking unknown talent but in becoming a breeding ground for experimental sounds.

A cursory scan of the streaming service reveals a deluge of genres: from kawaii trap and Nu Soul to swooz, broken beat, and stresswave. There are also artists like Sugg Savage, an ascendant Maryland newcomer who’s creating some of the best music of the moment and has become something of a Picasso in this regard: She has mothered genres as disparate as lowkey gospel (“Let’z”), spirit bounce (“Funk Bounce”), midnight boogie (“Party Dawg”), and bleep bloop blop pop (“Fill In The Blank”). Collectively, her songs could fit somewhere within the expanse of R&B, but a truer estimation of her work shows how each song belongs to a singular classification. “Let’z” draws from a multitude of sources—a Motown-soul-meets-Chicago-juke jambalaya of sonic bliss—but its core is imbued with the essence of gospel music: uplift, faith, a dogged optimism. “You better know he’s got a plan for you,” she croons just before the song’s conclusion, a sweetly sung aphorism that could just as easily have been pulled from the Bible. Hence: lowkey.

In early July, SoundCloud was reported to be on its last legs, having laid off 40 percent of its workforce in a move that, at best, felt reasonably apocalyptic. But cofounder Alexander Ljung remained confident, saying the digital music service, which had hoisted cultural forces like Chance the Rapper and Lorde to national audiences, was still solvent. SoundCloud was “completely unique,” he said. “You can find artists there that don’t exist anywhere else. Many are the next ones to accept Grammys. There is tremendous financial and cultural impact on SoundCloud. It will stay strong.”

In spite of the company’s nebulous future, Ljung was right about one thing. The platform is an utterly one-of-a-kind domain. If SoundCloud set out to build a business model on community-oriented music streaming for DJs, musicians, nascent podcasters, and mixed-media artists, it soon reflected that plurality in every regard, a network whose parameters seemed borderless. There was a sound for everybody. Stresswave not your thing? Try chillwave. Or funk wave. Or future wave. You would be hard-pressed to find a platform that has allowed for organic discovery as seamlessly as SoundCloud. A handful of my current favorite artists—Nick Hakim, Kwabs, Sonder, and Kaytranada, whose novel flip of Janet Jackson’s enduring ballad “If” I still spin weekly—I first came across on the platform.

In time, too, the service fulfilled its own pledge, becoming a genre itself: SoundCloud rap, or what The New York Times recently deemed “the most vital and disruptive new movement in hip hop thanks to rebellious music, volcanic energy, and occasional acts of malevolence.” It’s a sound predicated on dissonance that prioritizes “abandon over structure, rawness over dexterity” and has been adopted by a crop of Florida upstarts, including Smokepurpp and Lil Pump, whose track “Molly” was one of the highest-played on the platform this month.

Like Chance the Rapper and Lorde before him, the Philadelphia-born rager Lil Uzi Vert parlayed his titanic success on SoundCloud—the company anointed him its most followed artist of 2016—into mainstream sustainability, a major-label deal, and a headlining tour. I shadowed Uzi for a day last summer, from the edges of New Jersey into the heart of Lower Manhattan, just moments before he took the stage for a sold-out show. He’s been compared to Young Thug and Lil Yachty, rap eccentrics who have come to define a nontraditional hip hop sound. Currently, Uzi’s “XO Tour Llif3” ranks as the fourth-most-played song for the final week of July—coasting just above 1.4 million listens in the span of seven days. It’s a hazy melody about suicide, substance abuse, and his rocky relationship with his ex-girlfriend. But it’s not rap. Not really. On his SoundCloud page, Uzi tagged the song “alternative rock.”
