
Ana Lily Amirpour became a celebrated filmmaker her first time out. Her debut feature, 2014's black-and-white Iranian vampire flick A Girl Walks Home Alone at Night, got her more than Sundance buzz—it got her a deal to make a second feature with Megan Ellison’s Annapurna Pictures and Vice. That film, The Bad Batch, hits theaters this weekend. It’s … weird. Keanu Reeves plays a new-age messiah who gives monologues about poop; an unrecognizable Jim Carrey shows up as a hermit who says nothing; Jason Momoa, future Aquaman, features prominently as a cannibal. The dystopian romance ain't, as the saying goes, for everybody.

Not too surprisingly, the movie's critical response has borne that out. Whereas A Girl Walks scored an impressive 95 percent on Rotten Tomatoes, Amirpour’s follow-up is currently pulling just 45 percent. Some praise its idiosyncratic vision; others decry its lack of coherence or substance. But Jessica Kiang, writing for The Playlist, nailed what might be The Bad Batch’s biggest shortcoming. “The perils,” she writes, “of the broader-canvas follow-up to the sleek and economical indie debut are writ large: This is Difficult Second Album: The Movie.”


The sophomore slump has always been the worst kind of self-fulfilling prophecy. Any artist who achieves first-timer success invariably finds themselves hamstrung by a creative paradox: Their second effort gets more resources, sure, but also more scrutiny and expectation—and a lot less anonymity. (That added weight is often compounded for women and artists of color, who are much less likely to get a third or fourth chance.) Some filmmakers use their newfound capital to direct a big blockbuster, though that endeavor can pay off (Gareth Edwards' Godzilla) or flop (Josh Trank’s Fantastic Four) in equal measure. Others, like Amirpour, take the opportunity to indulge their weirdest impulses, like a movie about a young woman wandering the Texas wasteland who falls in with a group of cannibals.

Whatever the outcome, it’s essential that directors like her get to indulge their weirdest cinematic fantasies—even if they’re not for everyone. Indulging strange impulses every so often can prove highly beneficial. After Steven Soderbergh won the Palme d’Or at the Cannes Film Festival for sex, lies, and videotape, he made the literally Kafka-esque film Kafka. It was black-and-white and batty, but a few years later Soderbergh was lining up smart crowd-pleasers like Out of Sight and Ocean’s Eleven. Shane Carruth followed up Primer with Upstream Color. Spike Jonze cranked the meta-volume of Being John Malkovich past 11 with Adaptation; Diablo Cody, who became a critical darling after writing Juno, followed it up with Jennifer’s Body, which turned Megan Fox into a blog-speaking succubus. Not all of these efforts were praised, but they all proved to be turning points that helped their creators figure out where they would ultimately go—visionary or visualist, populist or pop-art. (Amirpour won’t divulge what her next movie is about, at least not concretely: “for as much as Bad Batch let me explore some of the shittiest things about us people, the next one lets me look at some of the better things, some of the things that really inspire me about how good we can be. Sometimes.”)

Amirpour isn't the only director dealing with this phenomenon right now. Just ask Colin Trevorrow, whose Sundance-hit debut Safety Not Guaranteed snared him the director's chair on Jurassic World—and subsequently Star Wars: Episode IX. Last week, his film The Book of Henry hit theaters and thudded its way to a 23 percent on Rotten Tomatoes; even the usually forgiving Rolling Stone scribe Peter Travers called it a “mess of conflicting ideas.” The fallout was immediate and alarmist, with Vulture even questioning if Henry would put Trevorrow’s Star Wars job in jeopardy.

It wouldn't, obviously, but it still led to some soul-searching for Trevorrow. As the critical takes started rolling in, he did an interview with the Empire Podcast wherein he called the bad reviews “heartbreaking,” but also acknowledged he’s under a microscope now as the guy with his hands on both the Jurassic Park and Star Wars movie franchises. "What I may have underestimated is how my visibility as somebody who is responsible for two things that we all care about deeply, and are massive parts of our public consciousness and shared mythology—how that level of visibility would shine a spotlight that I hadn't considered," he said.

The sophomore slump, then—well, in this case, the junior jag—can be valuable not only for creative catharsis, but for learning how to handle the public’s perception of your work. Richard Kelly still seems haunted by the reaction Southland Tales, his post-apocalyptic follow-up to Donnie Darko, received. Neill Blomkamp has said he was put in “a very strange place” by Chappie's poor reception, even though the movie “crystallized or congealed ideas in my head in a good way.”

Amirpour, too, is experiencing that feedback loop—and for her, it goes deeper than audiences simply not understanding her movie. During a Q&A at a recent Chicago screening of The Bad Batch, an audience member asked Amirpour what message she was trying to convey by having black characters die gruesome deaths in the film. The director responded, “I don’t make a film to tell you a message.” The exchange, and a series of subsequent Twitter threads, showed Amirpour is still learning how to contend with criticism. “I could have a conversation with people, but if someone’s hurling insults at you, let me just say, at the end of the day, I have feelings. You’re going to call me a racist or something, you think I’m not going to have feelings?” she says when asked about the exchange. “I don’t know what to say other than maybe Twitter is not a good place for me.”

For all of the internet's power to rehabilitate the image of once-overlooked pop culture, it's not so kind to esoteric new releases. Social media acts as an instant funhouse mirror for movies like The Bad Batch or Book of Henry, reflecting multiple versions back to their creators—some kind, some grotesque. Which of those depictions people will remember is based entirely on what their creators do next; to treat early experimentation as failure, though, dooms a movie's legacy before its influence has a chance to manifest.


Were you beginning to think that, post-D23 and San Diego Comic-Con, we wouldn't get any new information about Star Wars: The Last Jedi until it hits theaters this December? Then you hadn't considered the importance of publishing realities, with Entertainment Weekly dropping all kinds of fact bombs about the next installment of the saga from a galaxy far, far away. Meanwhile, Lando Calrissian is causing trouble, and the backstory of Rogue One turns out to raise an ethical conundrum that few people had really thought about before. Thank you for tuning in to the latest update on the HoloNet, and please remember to tip your Bothan.

Never Meet Your Heroes

The Source: Entertainment Weekly's massive Last Jedi preview

Probability of Accuracy: Consider this one more of an intentionally vague teaser than an accurate piece of information. But what a tease…!

The Real Deal: If you were expecting the Rey/Luke meeting in Star Wars: The Last Jedi to be a reprise of Luke's meeting with Yoda in The Empire Strikes Back, prepare to be disappointed. An Entertainment Weekly story—one of many this time around, considering they had a lot of spoiler-filled previews for the new movie—revealed that Rey finds Luke when his faith in the Force is at the lowest anyone has ever seen. Daisy Ridley described her character's response as, "Oh my God, this other man that I lost within a couple days was somewhat of a father figure. Now he’s gone, and instead I’m with this grumpy guy on an island who doesn’t want me here."

As for Mark Hamill, he seems as if he's trying to come to terms with what's happened himself. "The fact that Luke says, 'I only know one truth. It’s time for the Jedi to end…'" he said. "I mean, that’s a pretty amazing statement for someone who was the symbol of hope and optimism in the original films. When I first read it, my jaw dropped. What would make someone that alienated from his original convictions?"

The Perks of Being A Wallflower

The Source: Again, Entertainment Weekly's preview of the next movie

Probability of Accuracy: Pretty accurate, because who knows better how uncool a character is than the actor who plays them?

The Real Deal: Wondering what role newcomer Rose (Kelly Marie Tran) will play in the next Star Wars installment? If Tran's interview with EW is anything to go by, she might just be enough of a fan to help remind the good guys what they're supposed to be doing in the first place. "Poe Dameron is super cool. Finn’s super cool. Even though [Rose] is good at what she does, she’s not known… She’s not cool. She’s this nobody, this background player, which is what makes her interesting. She’s not the best. She’s not royalty. She’s someone who is just like everyone else," Tran said.

Rose comes into Finn's life at a point where he's questioning whether or not he wants to stay with the Resistance—and according to John Boyega, her influence helps him come to a decision. "It’s now an opportunity for him to be the best he can be. He has to make a decision, and Rose is there to help him make that choice," he teased. Is this some kind of meta-clue to tell us that it's okay to be fans as long as we keep inspiring our heroes to do the right thing? If so, I am here for this.

Take Care Not To Hurt Yourself

The Source: Making it a hat trick, Entertainment Weekly's Last Jedi preview

Probability of Accuracy: The information comes from director Rian Johnson, so we should all hope it's accurate.

The Real Deal: Turns out that the Porgs aren't going to be the only aliens that Luke Skywalker is sharing the island of Ahch-To with; Star Wars: The Last Jedi writer/director Rian Johnson told EW that he's also going to be dealing with another race called the Caretakers. "They’re kind of these sort of fish-bird type aliens who live on the island," he said. "They’ve been there for thousands of years, and they essentially keep up the structures on the island… They’re all female, and I wanted them to feel like a remote sort of little nunnery." What do they take care of, you might ask? Well, they just might have something to do with the structures on the island, which—if speculation is to be believed—might mean that they have some connection with the Force that we haven't quite seen yet.

Snoke Gets In Your Eyes

The Source: For the fourth and final time, the Last Jedi preview from Entertainment Weekly

Probability of Accuracy: As with almost all things Snoke, this one is entirely open to interpretation…

The Real Deal: With all kinds of speculation abounding about the leader of the First Order, EW added some fuel to the fire by asking The Last Jedi director to talk about what role Snoke does—and doesn't—play in the new movie. "Similar to Rey’s parentage, Snoke is here to serve a function in the story. And a story is not a Wikipedia page," Rian Johnson told the magazine. "For example, in the original trilogy, we didn’t know anything about the Emperor except what Luke knew about him, that he’s the evil guy behind Vader. Then in the prequels, you knew everything about Palpatine because his rise to power was the story."

So, how much of Snoke's story will be revealed in the new movie? Johnson is playing coy, saying only that audiences will "learn exactly as much about Snoke as we need to." (One thing he would reveal is that, while Andy Serkis' character will indeed be CGI, the actor's motion-capture performance was astonishing: "It’s one of those performances where after every line, I’d look over at whoever’s standing next to me with an expression on my face like, 'Oh, my God, we just got that.'")

Everything's Perfectly All Right Now On The Han Solo Movie. It's Fine.

The Source: Future Lando Mark 2 himself, Donald Glover

Probability of Accuracy: On the one hand, he's only talking about his personal feelings, so it's hard to say whether he's being accurate or not, or even if what he's saying translates to others in the cast. But on the other, this certainly contradicts the official line about how the production is faring…

The Real Deal: Turns out, the changeover between directors on the still-untitled Han Solo movie wasn't quite as smooth as the official party line would have it. In a Hollywood Reporter profile, Donald Glover, who plays Lando Calrissian in the movie, said that Ron Howard replacing original directors Phil Lord and Christopher Miller shook his confidence. "Ron is such a legend, and he knows exactly what the vision for what he is doing is … [but Phil and Chris] hired us, so you sort of feel like, 'I know I'm not your first choice …' And you worry about that," Glover says. "I feel like I was the baby in the divorce, or the youngest child. The oldest child is like, 'We know what's happening, but we are keeping you out of it.' And I'm just like, [Glover's voice rises several octaves] 'Was that scene good? How did you feel?'" The question is, will anyone be able to tell the difference between performances in the finished movie, which is still targeted for a May 2018 release?


Cassian Andor: Not a Big Fan of Consent, Apparently

The Source: Marvel's Rogue One spin-off comic

Probability of Accuracy: It was part of a canonical comic book story, so it's 100% accurate.

The Real Deal: The secret origins of Rogue One: A Star Wars Story's K-2SO were revealed in a Marvel comic book this past week, and the tale might have raised some unexpected issues. According to the Star Wars: Rogue One—Cassian & K-2SO Special, Cassian Andor reprogrammed the Imperial droid against its will in an attempt to avoid arrest during a mission, prompting at least one website to question whether or not there's an unpleasant rape analogy hiding just under the surface and waiting to be discovered. Well, Rogue One was always intended to be the morally murky installment in the series…


It’s such a simple question Rachael (Sean Young) asks Rick Deckard (Harrison Ford) in Ridley Scott’s 1982 film Blade Runner: “Have you ever retired a human by mistake?” They’ve just met in Eldon Tyrell’s opulent offices, and Deckard, a replicant bounty hunter, has come to interview Rachael as a means of testing the LAPD’s replicant-detecting Voight-Kampff device. Deckard’s equally simple response—“no”—comes without hesitation; he nonchalantly shrugs it off as though he’s never bothered questioning the supposed difference between humans and the androids he’s contracted to kill. The entire exchange takes about five seconds, yet it encapsulates everything that has fueled the public’s decades-long love affair with Blade Runner’s existential dread: What are humans? What myths do they take for granted? What have they been missing?

Over the past 35 years, Blade Runner has (rightly) been lauded for its artistic legacy and chillingly prescient vision. In that time it has also often (rightly) been critiqued for its flaws when it comes to its representations of gender and race. Scott’s film is full of female characters who are all replicants, yet their literal objectification is barely explored; East Asian aesthetics pervade its vision of dystopian LA, yet Asian characters are largely background players; its cyborgs are meant to be stand-ins for oppressed minority groups, but few, if any, minorities are actually present on screen. These shortcomings have become so apparent in the decades since the film's release that Blade Runner has become a shorthand for exploring those topics, even if only to show how sci-fi stories like it can succeed or fail at addressing them. So when word of a sequel arose, the question immediately became whether or not it would update its view of humanity along with its view of the future. The answer to that question, unfortunately, is: not so much.

(Spoiler alert: Minor spoilers for Blade Runner 2049 follow.)

Director Denis Villeneuve’s Blade Runner 2049 is certainly as obsessed with the eroding distinction between human and artificial life as Scott’s film was. Its production design—in 2049, Los Angeles is so overpopulated it looks more like a server farm than an actual human habitat—is just as breathtaking as the original’s. And its technological advances, like the tiny hover-pods that allow Tyrell’s successor Niander Wallace (Jared Leto) to see or the new biologically engineered replicants grown in Matrix-like cocoons, are executed in a way that propels the franchise 30 years into the future. Yet for all the attention paid to updating the sequel’s physical details, its three-hour plot does little to concern itself with anything beyond the depths of its white male protagonists, reducing white women to tired archetypes and utterly sidelining nonwhite characters.

Ford’s reprised Deckard and Ryan Gosling’s blade runner K both have complex inner lives behind their macho reticence. K, like Deckard, doesn't think critically about his job or the replicants he executes. His demeanor remains a mask for the audience to endlessly consider in long, uncut close-up—until a revelation forces him to question his identity, and his world falls apart repeatedly across his face. Deckard describes the heart-wrenching motivations for his self-exile and the agony that has accompanied it; Leto’s Wallace monologues at length about his megalomaniacal ambitions to play god to a species that can overrun humankind. Each man gets a story, and each story gets an airing.

Despite their unrelentingly pedestrian Psych 101 woes, these three men still manage to take up 95 percent of the emotional frame on screen, leaving little room for the women around them to have their own narratives. There’s manic pixie dream girlfriend Joi (Ana de Armas), whom K has literally purchased, à la Her. K’s boss, Lt. Joshi (Robin Wright), berates him at work and then invites herself over, drinks his alcohol, and comes on to him. Mariette (Mackenzie Davis), the sex worker with a heart of gold, repeatedly comes to K’s aid (in every way you can imagine). Wallace’s servant Luv (Sylvia Hoeks) has the most tangible personality, yet she’s obsessed with pleasing Wallace. Even Sean Young’s Rachael makes a cameo as a plot device for Deckard, embodying the final archetype—the martyred Madonna—of this Ultimate Sexist Megazord. Five female characters, not one of them voicing an ambition or desire that does not pertain to their male counterparts. Just because 2049’s future has females doesn’t mean its future is female.

Yet in a deeply ironic twist, the plot itself hinges entirely on their presence; without women, be they human or replicant, the secret K discovers that sends him on his harrowing mission wouldn’t exist. If Villeneuve and screenwriters Hampton Fancher and Michael Green recognized this, they must have ultimately decided they could accomplish the same goals without having to imbue those critical female characters with the same humanity as their male counterparts. (After all, when a character’s function is more plot point than passion, why bother giving it stage directions?) Several moments almost comment meaningfully on women’s disposability—Wallace’s casual gutting of a newborn female replicant; a giant, naked Joi addressing K blankly from an ad—yet each time, they become sad moments in a man’s narrative, rather than being recognized as tragedies for the women themselves.

Unsurprisingly, the problem worsens when viewed through a racial lens. Gaff (Edward James Olmos, now in a retirement home), a lab tech (played by Wood Harris), and two shady black-market dealers are the only men of color with more than one line; none have identities beyond their use to K. The only visible woman of color is one of Mariette’s nameless coworkers. (While de Armas was born in Cuba, her grandparents are European.) The third act finally delivers a plot twist that insists the story is not actually about K and Deckard—except the action continues to focus on them anyway.

As critic Angelica Jade Bastién recently noted at Vulture, mainstream dystopian sci-fi has always been obsessed with oppression narratives. While it returns over and over again to the downtrodden-rises-up-against-the-subjugator model, the genre has always had a remarkable ability to overlook the persecuted groups—people of color, women, the LGBTQ community, people with disabilities—whose experiences it mines for drama. White creators, men in particular, tend instead to whitewash their casts, imagining themselves as both villain and hero. Rather than simply putting the real thing in the story, they turn their tales into metaphors for it. Blade Runner 2049 falls into this trap: Even as Wallace grandstands about "great societies" being "built on the backs of a disposable workforce," everyone the movie deems powerful or worth exploring is still white and almost 100 percent male, relegating those disposable workforces’ descendants to the story’s incidental margins.

This was one of countless missed opportunities Blade Runner 2049 had to transform the franchise into not just a staggering aesthetic and technological achievement but also an incisive read of 21st century society. In the wake of Mad Max’s recent overhaul—which, while imperfect, managed to redeem many of its predecessors’ flaws—this misstep is especially disappointing. Yet like Deckard’s hurried brush-off of Rachael’s honest question in 2019, 2049’s filmmakers have attempted to tell a story about personhood in 2017 without actually considering the urgent politics that surround who gets to be a person in 2017. And in an era that begs for this kind of reinvention, its failure flattens its message into one more retirement for the books.

Replicant Required Reading

  • Inside the dark future of Blade Runner 2049
  • WIRED's Blade Runner 2049 review
  • Director Denis Villeneuve remembers seeing the original Blade Runner for the first time


You already get a Star Wars movie every year. Star Trek is coming at you from at least two directions. A good chunk of the Marvel movies are basically space opera. Big-screen fascination with science fiction and fantasy is nothing new—but now you can add the many flavors of TV network, from legacy broadcast to basic cable to streamers. Forget comic books; somehow, SF/F novels have become Hollywood’s hottest IP.

Some highlights of what may be on its way: Amazon is making Neal Stephenson’s Snow Crash and Larry Niven’s Ringworld, and a show based on Terry Pratchett and Neil Gaiman's Good Omens is in production. Universal is doing Hugh Howey’s Sand, Maggie Stiefvater’s Raven series, Kurt Vonnegut Jr.’s Sirens of Titan (with Community genius Dan Harmon at the helm), and Roger Zelazny’s Lord of Light, a.k.a. the movie that the CIA pretended to be making when they rescued American diplomats from Iran—a story that was itself the basis for the movie Argo. Done yet? Nope! HBO is making Nnedi Okorafor’s Who Fears Death? Netflix is making JG Ballard’s Hello America and Daniel Suarez’s Change Agent. And Lionsgate is making Patrick Rothfuss’ Kingkiller Chronicle series.

Phew.

Sure, some of these shows will never make it out of development; others won’t last a full season. Runaway successes like Game of Thrones and The Handmaid’s Tale make it worth the effort; Hollywood is nothing if not willing to keep doing what already worked. Besides, the saga of Westeros is about to wrap up, and then who'll take the reins of fire?

All those various networks and services have offered up plenty of original sci-fi, too, of course. Stranger Things is the obvious standout there. And I’ll do you a solid and point you at Travelers on Netflix, a wickedly clever time travel show from Canada, due for a second season this year.

Some deeper incentives might be behind the raiding of the science-fiction section, though. Familiar intellectual property has two advantages for a TV network. First, it’s already vetted. An editor with experience in science fiction has already made the sign of the IDIC over it and fired it out of a photon torpedo tube. Its characters, its world, and at least the skeleton of its plot live in the fictional universe.

As a consequence, TV makers don't need tea leaves—they can just look at Bookscan and get a sense of how big their built-in audience is. That’s never certain, and the number of people who read Snow Crash, genius though it is, probably isn’t big enough to turn it into a massive hit. But then again, viewership and ratings don’t have the same importance they once did, especially for streaming services. The Amazons, Hulus, and Netflices of the world are more interested in prestige, in must-subscribe programming, and—as a proxy for those other two things—maybe even in Emmys and glowing press coverage. (Quick counterpoint: Syfy’s The Expanse, likewise based on a successful series of books, hasn’t gotten the same attention or audience—which is too bad; it’s cool.)

Plus, genre often ends up being easier to market than "straight" or more literary drama. Take The Man in the High Castle, based on the Philip K. Dick novel. "What if the Nazis won World War II?" Pow. Now try to find a logline like that with, say, Atlanta.

Meanwhile, the larger multiverse of IP sources has become more constrained. The worlds of comic books and big sci-fi franchises are bottled up and decanted among various studios already. (That said, I’m super-psyched to see Greg Rucka and Michael Lark’s Lazarus and Ed Brubaker and Steve Epting’s Velvet comic books, from the creator-owned imprint Image, getting TV development deals, too. Lazarus is political, dystopian sci-fi and Velvet turns a Ms. Moneypenny-like character into a superspy on the run.) Demand for overall television content is high. Demand for genre content is high. The choices are: Either take a risk and make original stuff, or take slightly less of a risk with a known quantity from somewhere. Anywhere.

Movies used to be the only medium that would try to adapt something as ambitious as a book. In sci-fi and fantasy, sometimes that’d give you the Lord of the Rings trilogy (yay!) and sometimes it’d give you The Dark Tower (um). But now premium television makers have realized that it’s easier to unspool the stuff books are best at—details, set pieces, emotional journeys—in six or 10 or 15 hours than in a mere two. TV’s looking for books; books work better on TV. For the kind of people who roll science fiction books on their Kindle like a development agent rolls calls during casting season, this middle is a very good one to be caught in.

Small-Screen Sci-Fi

  • Charlie Jane Anders on how The Expanse is transforming TV.
  • Adam Rogers on CBS' challenge in turning Star Trek into a streaming-exclusive prestige series.
  • Brian Raftery on The Handmaid's Tale transforming Hulu into a prestige-TV heavyweight.


It's Not Just You. TV Has Hit Peak WTF


Note: This story contains spoilers for the current seasons of Twin Peaks: The Return and American Gods.

Diane, I'm lost: For the last week and a half, I've been wrapped in the spastic world of Showtime's Twin Peaks: The Return, the new 18-hour series from gleeful zig-zagger and noted nightmare sommelier David Lynch. Though only four hours in, The Return is already my favorite new series of the year, full of exquisite terrors, concussion-quick humor, and lovingly reintroduced old faces (surely I'm not the only fan who gets near-weepy whenever a Peaks pal like Bobby or the Log Lady wanders back into view after all these years). Here's a show in which Kyle MacLachlan gets to apply his parched comedic skills to, at last count, three different iterations of Dale Cooper; in which an akimbo-limbed tree speaks via a fleshy, bobbing brain-maw-thingee; and in which Michael Cera lisps his way through an impression of a Wild One-era Marlon Brando. Each episode of the new Twin Peaks is a dark hoot of the highest order, and I feel like Mr. Jackpots just for being able to watch them.


I should also mention that, for the most part, I have no idea what is going on with this show. There are a few plot outlines you can pick up with each installment—like all of Lynch's work, The Return has a forward momentum, even if it's not always discernible—but every new sequence in Twin Peaks sires at least a half-dozen questions: What was the space-floating metal contraption that Cooper visited? Who was the oil-slicked, ghost-like figure sitting in that South Dakota jail cell? Will we ever gaze upon Sam and Tracey's mysterious glass box again? (Speaking of which: Rest in Peaks, Sam and Tracey! Unless, of course, you're somehow still alive, which I suppose is very possible at this point!)

These are the kind of queries that might normally send one to the internet, where an ever-teeming team of culture experts—re-cappers, explainers, best-guessers—stand by every Sunday night at 10:01 pm, ready to tell you What Just Happened. In some ways, this kind of forensics-first approach can be traced back directly to Twin Peaks. In the early '90s, when the first series debuted on ABC, it was an instructive bit of deconstructionism: A network series made to be discussed and scrutinized, with Lynch and co-creator Mark Frost inspiring questions about the show's narrative (Who killed Laura Palmer?) as well as the medium itself (If a TV drama can get away with flashbacks and dream sequences, then what else can you get away with?). The modern web didn't exist when Twin Peaks premiered, but those sorts of thoughtful debates both predicted and informed the way we'd eventually talk about TV online.

But in the case of Twin Peaks: The Return, such well-intentioned hunches feel pointless—and, at least for me, totally joyless. Twitter and Facebook have made it possible for us to lob opinions and theories as soon as the closing credits roll, but they've also made it tough to become giddily, utterly caught in the grasp of an immersive piece of art. Twin Peaks: The Return has that sort of lingering hold, in part because it's such a goofy marvel to behold, but also because the episodes never come to any sort of definitive conclusion. Instead, they simply float away on the back of a Chromatics synth or some Cactus Brothers harmonies, then remain lodged in your subconscious while you await the next Return. Asking "Wait, what's really going on here?" in the middle of this ever-rare reverie feels like the exact opposite of curiosity. After all, the more you know what's going to appear in the big glass box—my lazy box-as-TV analogy, not Lynch's—the less face-chewed you'll be by the results.

Such let's-just-see-where-this-goes immersion is also one of the pleasures of the other current Sunday-night TV fun house: Starz's American Gods. Unlike Twin Peaks, Bryan Fuller and Michael Green's Gods—which depicts a planet-threatening power struggle between its titular titans—is theoretically knowable, having been adapted from the 2001 novel by Neil Gaiman. For anyone who, like me, finds themselves occasionally confounded by the show's expansive cast, metaphysics-bending laws of nature, and pop-culture riffage, there are countless Gods primers and decoders available online.

Yet it's hard to imagine why anyone would even want to figure out everything that's going on in this gorgeous brain-poker of a show. On Gods, heroes and villains are introduced without hand-holding or hand-wringing; they simply show up and plunge into the action, their greater motivations sometimes not made clear for weeks, if at all. The world of Gods is less surreal than the world of Peaks, but it's just as ravishingly cuckoo: There's a bar-brawling leprechaun and a literally fiery-eyed taxi driver; a morph-happy god, played by Gillian Anderson, who inhabits the form of Lucille Ball, Marilyn Monroe, and Aladdin Sane-era Bowie; and a roof-dwelling watcher who, at one point, plucks the moon out of an impossibly perfect nighttime sky. To get an idea of just how strange this show is, consider that one of American Gods' most relatively normal characters is played by Crispin Glover.

And, like Twin Peaks, the stunning-looking Gods casts a small-screen spell that's best left unexamined, in order for it to remain unbroken. The show has a hard-to-pin rhythm—some characters appear only in brief vignettes; others, like Ricky Whittle's Shadow Moon, can take over entire episodes—that allows Green and Fuller to make each episode as narratively elastic as possible. And every scene in Gods is saturated in a daring, beyond-garish visual palette that makes its most out-there moments (like a nighttime desert-sex scene between two fire-consumed, liquid-skinned men) all the more grandly hypnotic. To stray too far away from this version of Gods by wiki-seeking every still-undisclosed detail would be like pinching yourself awake during an especially dazzling dream. For now, the best way to enjoy both Twin Peaks: The Return and American Gods might be to surrender to their utter weirdness, maybe even try to savor it—ideally, with a piece of pie and some damn good coffee.

The question was simple enough: “Where the hell is Kendall Jenner with her Pepsi 12 packs?” As terror enveloped Charlottesville during the “Unite the Right” rally on Saturday, where hundreds of neo-Nazis and Klansmen violently mobilized in a mutinous showing of white pride to challenge the removal of Confederate statues, that was all it took to resonate with tens of thousands. Though cloaked in humor, its meaning cut deep, the mockery of Pepsi’s ill-conceived April TV ad distilling how racial and social harmony in America remains something of a joke.

The reaction was neither unexpected nor the exception—on Twitter, the binding of anguish, cynicism, and satire has become a shared lingua franca in the wake of national torment.

I spent the weekend in Cleveland, mostly offline, celebrating a friend's wedding. What news I did consume, as I attempted to make sense of the Charlottesville protest turned melee, came via Twitter: images of ghoul-faced white men hoisting torches in medieval defiance, of bodies being launched into the air, and of extremists bludgeoning Deandre Harris, a young black man, as he fell to the concrete. There were people who assembled to honor the life of Heather Heyer, the 32-year-old woman killed during the rally’s fallout. Tweets, too, memorialized the names of individuals like Heyer who were victims of recent incidents grounded in gruesome racial contempt.

In spite of Twitter's capacity to rapidly disseminate news like few other digital rostrums, many of its critics (and users) fear that the social hub has largely become a cesspool of snark and nonsense; “everyone is bad” and “never tweet” are frequent rejoinders to the madness that the platform has been known to spur. In the wake of an occurrence like Charlottesville, one of such ferocity—when deep hatred, out-and-out racism, and death foment in disastrous concert—Twitter becomes a conduit for externalized grief. My feed turned into a tempest of trauma and humor, an emotional battleground yoked by what writer Hilton Als once solemnly described as “a reality I didn’t want to know.”

Rampant denial of America’s past—with its history of slavery, redlining, and mass incarceration—led to tongue-in-cheek tweets about shallow white allyship and President Trump’s equivocating “on many sides” statement. In the aftermath of homegrown extremism or tragedy, a new contagion often emerges, built on innocence and patriotism; stock expressions like “I can’t believe this is happening” or “We are better than this” become a handy, if fictitious, narrative. But on Twitter, such sentiments are relentlessly met with a chorus of ridicule and truth.

There’s also the exhaustion many feel at having to address white ignorance in moments of racial conflict.

The jokes, and their surprisingly codified form, are all part of the Twitter cycle—the news and the grief never outweigh or dilute the humor, and vice versa. And for many of Twitter’s marginalized participants, be they black or gay or female, humor functions as a necessary safeguard. “The absurdity of reality—the only way for many people to deal with that is satire and comedy,” says Charisse L’Pree, who teaches media psychology at Syracuse University.

L’Pree believes microblogging platforms like Tumblr and Twitter have become overwhelmed with people’s desire to share, which often escalates in times of crisis. “This is a process by which we cope with tragedy when there is a very visible white power movement happening,” she says of tweets like Jenkins’. But she is careful to clarify: “It’s a pretty standard reaction; we just see the evidence more because of social media. But the frequency of retweets demonstrates that what this person is saying is actually reaching and resonating with people.”

How we’ve learned to shoulder the terror, in the real world and the world we construct online, is perhaps the most telling. The footage and first-hand accounts out of Charlottesville have been at once crushing and completely unsurprising, a reflection of an America many have endured before and are likely to endure again. For those geographically removed from the mayhem, Twitter acts as a necessary remedy. It is a coarse statement, but a no less true one: humor soothes.

Early Sunday morning, I came across an image on my timeline. The photograph had been taken a month before, at a similar "save the statue" Klan rally, but it couldn't have felt more timely. A black man in jean shorts sits somewhere in the vicinity of the protest, smoking a cigarette. He looks untroubled, calm. Wholly unbothered. In his hand he holds a sign that reads “Fuck yo statue.” I laughed for a second, thinking of Chappelle’s Show’s watershed sketch about funk musician Rick James, and then I did what I always do. I kept scrolling.

The anatomy of a Little Simz song doesn’t offer itself up easily. On the recent “Good For What,” a whir of tough North London bristle, the rapper born Simbiatu Ajikawo contemplates early successes. “Look at young Simbi in Vogue/Look at young Simbi in Forbes,” she says in the video, merrily skating through the streets of Los Angeles. “Well someone’s gotta do it right/Someone’s gotta open doors.” As British rap has found more footing in US markets over the last handful of years, taking a bigger share in the international mainstream—partially owed to a greater cross-cultural exchange among artists themselves—Simz has come to represent how music can best travel among divergent cultures in our increasingly globalized world.

One argument, familiar to anyone privy to the nativism of Donald Trump and his ilk, contends that globalization actually dilutes local cultures. In popularizing the customs of a given community, the thinking goes, these things in some way lose their truer essence. We've seen that argument play out with Drake, whose ardent obsession with UK culture has granted artists like Jorja Smith, Skepta, and Giggs more mainstream visibility in North America. Simz’s rise—she has taken the independent route, creating her own label—is, in part, a rebuke to that thinking: she proves that the best conduit can still be the self, even when it’s away from home.

Metaphor or not, “Good For What” finds the young Brit looking back even as she pilots forward, taking solace in the palm-treed environs of Southern California but still every bit the girl who grew up under the metal skies of Islington. The music, all backbone and unflinching emotional lucidity, may have changed locations, but it’s remained unmistakably Simz, a posture that is no mere performance. “I was made for this shit,” she declares, over Astronote’s murky, atmospheric production. And later attests: “Cause this is bigger than you thought/Thought I was finished, let me give you more.”

“More” has never been a problem for Simz. She is a covetous creator, having collaborated with artists like reggae revivalist Chronixx, Rihanna songwriter Bibi Bourelly, and soul experimentalist Iman Omari; toured with the Gorillaz; and dropped 11 projects since 2010 (a blend of albums, mixtapes, and EPs). There’s been no disconnect in Simz’s presence stateside, either. When she initiated one of the several freestyle cyphers at the BET Hip-Hop Awards in mid October, she did so as the only black woman artist hailing from the UK. With a pinch of English cool, the 23-year-old rapper spoke of her adolescence and the tirelessness it took to overcome the likelihood of turning into just another cultural data point. “Who’d thought this would happen/ teachers would tap me funny when I said I’d make it from rapping,” she offered in the minute-long verse.

Though the cypher included brash ascendants like Detroit's Tee Grizzley and Atlanta sing-rap polymath 6lack, Simz held court like a seasoned pro: dynamic and levelheaded, if exceedingly expeditious in her layered delivery. Her authority carries little surprise to anyone who has followed the young rapper's continued climb, gaining traction in the US since issuing her seductively ruminative E.D.G.E. EP on SoundCloud in 2014 (the breakout track “Devour” has since amassed 3.65 million streams).

Still, the most radical element of Simz’s arsenal may be her grandiosity. A song like “Good For What”—with its puffed-up moxie and tales of shrewd diligence—provides another roadway into her appeal by better refining the many avatars she dons so effortlessly, accentuating the social realities of black women. Simz’s sustained output has also allowed her to be even more elastic in her selfhood. There is a vulnerable intensity alive in her work; it satiates but jars the soul, lines so ordinary you forget how much power they hold in one’s own life. “My imperfections make me who I plan to be,” she sang on “Doorway + Trust Issues,” from January’s Stillness In Wonderland (the deluxe edition, which features seven new songs including “Good For What,” releases November 4).

The final shot in the video for “Good For What” zeroes in on Little Simz, standing by herself in the middle of a nondescript LA street, the line “Look at me, once again I was made for this shit” looping in the background. The message is unmissable: no matter where she’s at, it’s best we leave the translation up to her.


Let's just get the weird part out of the way: I'm typing these words on an invisible computer. Well, kind of. There's a visible laptop open on the corner of my desk, but only so my (also not invisible) keyboard and mouse can plug into it. But the window containing these actual words I’m writing? That's just hovering in midair, directly in front of my face, equidistant between me and the now-dark, real monitor I usually use at work.

To be honest, though, right now I’m a little more interested in the other window hiding behind this one—the one with last night’s NBA highlights all cued up and ready to help me procrastinate. So I reach out with my hand, grab the work window by its top bar, move it out of the way, and commence watching Victor Oladipo bury the San Antonio Spurs. Even better? Since I’m the only one who can see these windows, my next-desk neighbor doesn’t know exactly what I’m doing. To her (hi, Lauren!), I’m just the idiot sitting there with a space-age visor on, making grabby motions in midair.

This is the vision of “spatial computing,” an infinite workspace made possible by augmented reality. And while my workspace at the moment isn’t quite infinite, it still stretches across a good part of my vision, courtesy of the Meta 2 headset I’m wearing. There’s a window with my email, and another one with Slack, just so I know when it’s time for me to jump in and start editing a different piece. The question is, is the idiot sitting there in his space-age visor able to get all his work done? That’s what I’ve spent the last week trying to figure out. Spoiler alert: he isn’t.

But the experiment also suggests a different, more important question: will the idiot in his visor be able to get all his work done in it someday? That’s the one that has a more hopeful answer.

If virtual reality’s promise was bringing you inside the frame—inside the game, or the movie, or social app, or whatever screen-based world we’ve always experienced at a remove—then AR’s is turning the whole damn world into the frame. The virtual objects you interact with are here now, in your real-life space, existing side-by-side with the non-virtual ones. While at the moment we’re mostly doing that through our phones, we’re on the cusp of a wave of AR headsets that will seek to turn those pocket AR experiences into more persistent ones.

Meta 2 is one of those headsets; at $1495, it poses an interesting threat to the far more expensive Microsoft Hololens, as well as the who-knows-when-it’s-coming Magic Leap headset. (Despite the three using differing marketing language—"augmented," "mixed," "holographic"—they all basically do the same thing.) It’s still a developer kit, though Meta employees are quick to tell you that they use theirs every day at work. But while lots of non-employees have gotten a chance to see what the Meta 2 can do in the confines of proctored demonstrations, not many outside the developer community have had the luxury of an extended multi-day visit with the thing. I have. And I’ve got the enduring red forehead mark to show for it.

This isn’t a product review, so I’m not going to take you through the specs of the thing. Here’s what you need to know: Its field of view is significantly larger than that of the Hololens (which can sometimes feel like I’m bobbing my head around looking for the sweet spot that lets me see the virtual objects fully), and its lack of a pixel-based display—there are twin LCD panels, but they reflect off the inside of the visor—means that visuals appear far sharper at close range than VR users might be used to. Text is more readable, images more clear. In theory, it’s perfect for the kind of work I do as a writer and editor.

The Meta 2 uses an array of outward-facing sensors and cameras to map your physical surroundings, then uses that map as a backdrop for everything you do in the headset. That means that if you push a window allll the way behind, say, your computer monitor, it should effectively disappear, occluded by the real-world object. The key here is should: like many of the Meta’s most interesting features, it’s inconsistent at best. The mouse pointer would sometimes simply disappear, never to return; until the company pushed a software update, the headset refused to acknowledge my hand if I was wearing a watch; it wasn’t uncommon for the headset to stop tracking me altogether.

The headset’s software interface, called Workspace, is a bookshelf of sorts, populated by small bubbles. Each represents a Chrome-based browser window (albeit a minimal rendition, stripped of familiar toolbar design) or a proof-of-concept demo experience—and maybe soon, third-party apps. To launch one, you reach out your hand, close your fist around it, and drag it into free space. (Hand selection was an issue throughout my time with the headset; if I wanted a no-second-takes-necessary experience, I generally opted for a mouse.) There’s a globe, a few anatomical models, a sort of mid-air theremin you can make tones on by plucking it with your fingers, and…not much else. That’s not necessarily a concern; this may look and feel like a consumer product, but its only real purpose is to get people building apps and software for it.

But as a writer and editor who was ostensibly using it to replace his existing setup, I simply didn’t have the tools for the job. Meta’s current browser is based on an outdated version of Chrome, meaning that using Google Drive was out—both for writing and for syncing with any other web-based text editor. The headset allows a full “desktop” view of your computer, but anything you open in that view takes a big hit in clarity; editing in Word, or even in a “real” web browser, wasn’t worth the eyestrain. Did I enjoy having a bunch of windows open, and moving them around on a whim? Of course. Did I like the fact that I could do my work—or not—without prying eyes knowing I was agonizing over yet another sneaker purchase? God, yes. But for day-to-day work, the "pro" column wasn't nearly as populated as the "con."

Every company working in this space rightfully believes in the technology’s promise. Meta even partnered with Nike, Dell, and a company called Ultrahaptics, which uses sound to create tactile sensations (yes, really), to create a vision of the future that makes Magic Leap’s promotional pyrotechnics look like a used-car commercial.

But this isn’t just augmented reality; it’s not reality at all. At least not yet. Certainly, augmented and mixed reality is well-suited to fields like architecture and design; being able to manipulate a virtual object with your hands, while still sitting or standing with colleagues in the real world, could very well revolutionize how some of us do our jobs. But for now, most of AR’s professional promise is just that. Even a diehard Mac user can get used to a Windows machine, but until object manipulation is rock-solid, until the headset is all-day comfortable, and until there’s a suite of creative tools made expressly for AR rather than web-based workarounds that may or may not work, this is simply a fun toy—or at least a shortcut to looking like a weirdo in the office.

In a couple of years' time, though? That's another story. As with VR before it, the AR horse left the barn ages ago; there's so much money flowing into it, so much research flowing into it, that significant improvement is only a matter of time—and not much time at that. So don't take my problems with a developer kit as a doomsday prophecy; think of it like a wish list. And right now, I just wish it could be what I know it will be.


Edgar Wright's Baby Driver begins the way most capers end: three goons pulling off a bank heist, then their getaway driver leaving the cops in his rearview. Unlike most capers, though, the escapade goes down to the pulse-pounding strains of The Jon Spencer Blues Explosion's "Bellbottoms." It’s a whiplash-inducing rush that lays more than a dozen cars to waste and doesn’t let up for the song’s entire 5:16 runtime, each screeching turn and crash perfectly timed to the track's churning rhythm. And it ought to be—Wright’s been plotting it ever since he heard "Bellbottoms" in 1995. “That moment was the closest I’ve ever come to synesthesia,” the writer-director says. “I would listen to that song and start visualizing a car chase.” As a result, he made a movie perfect for a song, instead of finding a song perfect for his movie.

What’s incredible, though, is that when he dreamed up Baby Driver—arguably the first film to make the iPod a central character—most people were still making each other mixtapes. Back when “Bellbottoms” was released, folks couldn’t dance down the street with 3,000 songs at their disposal. Even with the 15 you could cram onto a single side of a cassette, cuing up the right track at the right time was just about impossible. Portable CD players helped deliver music faster and more accurately, but they skipped constantly if you walked with anything more than the slightest spring in your step. iPods and their non-Apple mp3-playing ilk changed all that, allowing hundreds of hours of music to be stuffed in your pocket, ready to be cued up at just the right moment.

“For the first time, which wasn’t the case with the Walkman or the Discman, the iPod meant people could basically start soundtracking their own lives,” says Wright.

That’s what getaway driver Baby does throughout the movie. Looking to fulfill a debt to the crime boss Doc (Kevin Spacey), he times every heist to a specific song. A sufferer of tinnitus, he needs the music to focus on the road—to drown out the hum in his ears and the chaos around him. That leads to some exquisitely crafted car chases, but it also leads to some moments more relatable to folks who don’t know how to execute proper donuts. Like, for example, the time Baby goes on a coffee run set entirely to Bob & Earl’s “Harlem Shuffle.” We've all had that moment, all known instinctively based on the weather, mood, or activity just what song to cue up, then timed every footstep or lane change to that song. For music fans, getting it right feels like high-fiving a million angels. For a director like Wright, making a movie that way is downright genius.


Lots of directors make movies with soundtracks in mind—and often have songs in mind when they develop their films. Quentin Tarantino and Cameron Crowe are both known for this, but Wright took it a step further, timing scenes to the songs he knew he was going to use. Instead of syncing the action to a track as the movie was being edited, he shot it beat-for-beat—just like we all do when we time our morning run to Beyoncé’s “Ring the Alarm” (or, you know, whatever is on your sprint mix).

“When life starts to sync up with your soundtrack, it’s a magical moment," Wright says. "If it’s something where you’re walking and it’s cloudy and the sun comes out in time with a bit in the song you feel like you’re omnipotent—so Baby Driver is an entire movie made up of moments like that.”

As a result, Baby Driver feels like a rollicking action movie that just happens to play like the fantasy everyone has experienced when the bassline of their favorite song times perfectly to their footfall—it’s just that when Baby’s foot falls, it’s dropping the pedal to the floor.

The next time you watch George A. Romero’s classic 1968 creepshow Night of the Living Dead, do your best not to look away. It won’t be easy, as the zombie-zeitgeist-defining shocker—filmed in stark black-and-white, and populated with terrifyingly dead-eyed human-hunters—still has the power to unnerve, nearly 50 years after its release. But take a closer look, and you might get a sense of just how much low-budget derring-do and luck was involved in making one of the most epochal horror films of all time. “There’s a copy of the script visible in one of the frames!” Romero told the New York Times last year. “I won’t tell where. It will be a little challenge for fans to spot it.”

Romero, who died Sunday at the age of 77 after a brief bout with lung cancer, directed several smart-schlock joy-rides during his decades-long stint as a director and writer, including 1973’s bio-shock thriller The Crazies, the 1978 blood-sucking drama Martin, and 1982’s lovingly yucky comics adaptation Creepshow. But his long-running career was always better off dead, thanks to a series of socially minded zombie movies that began with Night—which Romero and a bunch of friends shot in rural Pennsylvania at a reported cost of just $114,000. (The production was so commando that, at one point, a member of the film’s production team borrowed his mother’s car to shoot a scene, and wound up smashing the windshield; he repaired the damage before she could find out).

Romero didn’t invent the zombie movie, but he did reanimate it, using cheap-but-effective effects, patient camerawork, and amateur actors to give the movie an almost documentary-like urgency. As a result, Night went on to earn millions at drive-ins and college theaters across the country, making it one of the biggest independent smashes of the century, and a clear influence on everything from 28 Days Later to World War Z to The Walking Dead.


Still, it wasn’t just the movie’s eerily determined flesh-eaters that made Night of the Living Dead a hit; it was the suffocating on- and off-screen mood it captured. Released during one of the most divisive and paranoia-prone years of the ’60s, Night climaxed with a harrowing finale, in which an African-American protagonist (played by Duane Jones) survives a bloody night of zombie-fighting—only to be shot dead by a white gunman. It was a savage bit of social commentary snuck into a midnight movie, and though Romero maintained that the movie wasn’t supposed to be a racial allegory, Night nonetheless proved that horror films were an ideal vessel with which to examine the nightmares of our real world. As Get Out writer-director Jordan Peele noted earlier this year, “the way [Night of the Living Dead] handles race is so essential to what makes it great.”

Romero would make five follow-ups to his shambling breakout, and though 1985’s barf-cajoling Day of the Dead and 2005’s Land of the Dead were both gross, groovy B-movie fun, his greatest work was 1978’s Dawn of the Dead, which followed a team of survivors as they hid out in a zombie-infested shopping mall. Dawn’s anti-consumerism message was a hoot, and the movie could have functioned as pure camp alone—except it remains downright terrifying, full of anatomically detailed gore (all hail gross-out king Tom Savini!), believably desperate heroes (and baddies), and hordes of lurching, formerly luxury-seeking zombies whose dead-eyed lust for more, more, more! seemed awfully human. Dawn comes to mind every time I drive by an empty shopping center, or gaze at a dead-mall photo gallery, and it proved that few other filmmakers understood our base desires—and their ruinous effects—quite as well as Romero. "When there's no more room in hell,” a character says in the movie’s most famous quote, “the dead will walk the Earth." Thanks to Romero, we all got to experience that hell from a safe, scared-brainless distance.
