Tag Archive: CULTURE


More than a decade later, the thing I most remember about Okami is how color follows you wherever you go. Released in 2006 by the now-defunct Clover Studio, the game starred a wolf-god named Amaterasu in a vibrant world inspired by Japanese ink wash painting. The folkloric Japanese landscape Amaterasu finds herself in, though, is dying—empty and colorless. The eight-headed demon Orochi has been unsealed to wreak havoc, and in doing so he has turned everything literally black and white; the world is effectively a painting with its hues all gone.

That color comes flooding back when you help the people of Japan fight Orochi. It bursts forth from Amaterasu—an incarnation of the Shinto goddess of the sun, an avatar of life and light—and spreads outward to fill the landscape. Flowers erupt from the ground. Okami's pastoral world sings and becomes new with each victory, each step made against the lingering darkness. It's as potent an image of renewal and redemption as I've ever seen, one of the only moments in any game to stir the religious parts of me.

Now, after a lengthy absence, Okami itself has been renewed, updated to run on modern consoles and the PC. For a game that sold only around 200,000 copies at launch, released just months before the studio that made it collapsed, it's a well-deserved resurrection. Okami deserves a place in the modern landscape, but its return is more than a commercial boon. Meditative and warm, dedicated to an uncomplicated belief in the beauty of the natural world and the power of people to make it better, the game is a balm—an emotional corrective in a time of upheaval.

In keeping with the sumi-e art style, Amaterasu herself is an artist, wielding a magic paintbrush with the aid of a tiny companion named Issun. Her magic ink gives her direct authorial sway over the world itself—a sweeping brush stroke might create a mighty wind, while a wide circle might be a means of pushing life back into the dead world. Playing Okami is a surprisingly creative endeavor: you use Amaterasu's powers not only to redeem but to transform the world, opening up passages, slaying demons, and calling the powers of nature itself to your aid.

The creators at Clover placed these ideas within the familiar structure of a 3D The Legend of Zelda-style game. Okami has a narrow open world to navigate and various constrained encounters and dungeons gating off parts of that world, creating, as in a Zelda game, a slow sense of progress and a broad sense of adventure. Amaterasu plays the role of a vagabond god, setting right what once went wrong, occasionally pausing along the way to indulge her inner wolf and howl at the moon or dig for a buried treat.

Okami's biggest weakness is that it's perhaps too long, hedging its bets too far toward a traditional single-player experience and pushing past the point where its distinct mechanics start to lose their mystic luster. But at the same time, every moment in this world feels special. The creators at Clover Studio, a group that included Shinji Mikami of Resident Evil fame, Hideki Kamiya, and a bevy of other minds who would go on to create Platinum Games, put everything they had into Okami. They built an underrated masterpiece, the kind of beautiful work that's critically acclaimed but forgotten all too quickly.

Now, good fortune and the good sense of its publishers at Capcom have made Okami more widely available than it's ever been. Amaterasu will be there waiting, her white tail bushy and wild in the wind, eager to lead the way into a world worth saving.

If you asked me to list the ways I thought Elton John might one day announce his retirement from touring, "a splashy, CGI-filled VR retrospective" would have been nowhere near the top. Maybe in the low teens. Maybe. Yet that's exactly what he did earlier today—and a few weeks before that, it's exactly what I'm experiencing in a small, dark room outside LA.

That's where I am in corporeal form, at least. Inside the VR headset I'm wearing, I'm in a different small, dark room in southern California: West Hollywood's iconic Troubadour nightclub in 1970, peering into the bespectacled, CGI-ed face of a 23-year-old Elton John while he sings “Your Song.” It’s a recreation of his first US concert, the one that catapulted him to global fame, and as I glide weightlessly around his piano, thousands of golden specks—metaphorical stardust, presumably—fall from the ceiling, swirling around the demure young man at the piano.

Then the scene changes, and so does Elton John. Now I’m onstage in front of a packed Dodgers Stadium during one of the musician's two 1975 shows, and a very sequiny John is pinballing around the stage screaming out “Saturday Night’s Alright For Fighting.” It is, I admit, a little overwhelming. At one point his face swings so close, and has such believable dimension, that I take a big step backward—right into a bundle of what I assume are some very important cords.

“Let me get you a chair for this part,” says Ben Casey, founder and CEO of Spinifex Group, the creative studio/digital agency/production company behind this extravaganza. I take a seat, and just in time—the ground falls away and I zoom into outer space, float around in Elton John’s cocaine party of a private jet and some lava-lamp looking nebulas, and traverse a whirling yellow brick road back down to Earth. All the while, images from The Lion King and Gnomeo and Juliet flash by, along with other visions of John throughout his career.

When the intergalactic acid trip ends and I emerge from my headset—a little dizzy, completely overstimulated, with “Rocket Man” firmly stuck in my head—that little corner of Spinifex’s offices seems even darker and smaller and grayer. Clearly, I was wrong: Turns out a VR experience like Farewell Yellow Brick Road: The Legacy is the most Elton John announcement anybody could have hoped for.

The Elton Factor

And of course it is. The very existence of this event—the VR experience that was just simul-blasted to headset-wearing audiences at events in New York, Los Angeles, and London, followed by a concert and Q&A livestreamed by YouTube to fans around the world, all to announce his upcoming final tour—is a testament to the creative clout and staggering influence wielded by John and his team. “Working with these guys felt very much like rocking up to palaces and talking to rulers of the Emirates or officials in China,” Casey says. (Both are things he's actually done.) “There’s this assumption that they’re going to do the next big thing.”

Once Spinifex sold John's team on their vision (a nearly six-minute VR piece encapsulating John's career, followed by a live performance and Q&A, all with global reach, all somehow accomplished without overheating any of the large audience's headsets), that vision became the watchword—naysayers be damned. Imagine being told by Google, as Team Elton was, that what you were asking for was, if not technically impossible, unbelievably challenging. Then imagine convincing Google they were wrong about the limitations of their own technology, pretty much just because you said so. This is the world Elton John lives in.

And once the collaborators determined how audiences would be watching, there was the whole question of what, exactly, they’d be looking at. And how, exactly, they could make it look good. “We have all these darlings of the VFX world in, people who’ve done Deadpool and other big, cutting-edge visual effects productions,” Casey says. “And they genuinely reached a point in this process where all they could say is ‘That hasn’t been thought about yet.’”

The novel challenges were these: In order to chart the arc of his career, the team had to create believable versions of Elton John at various ages and recreate scenes from the 1970s with extremely limited (and lo-res) reference materials. Both required a good bit of help from a very busy and somewhat cantankerous musical icon.

Capturing John’s present-day incarnation is easy, because that guy exists. But the Elton John who played at the Troubadour? Five grainy images are seemingly the only documentation left. So Spinifex had to sculpt their recreation of young Elton from the scant resources they had, along with a CGI-youthified face modeled off scans of the 70-year-old star. Fortunately, John’s signature oversized glasses make that task a bit easier.

But there's more to a classic Elton John experience than a digitally de-aged face. Spinifex couldn't just create a generic CGI doppelganger: the character had to perform and play like Elton John. Asking the singer to reprise his 1970s antics—running around stages, kicking his foot up on top of a piano and strumming his thigh like a guitar—seemed a bit much. “When we realized we were going to need a body double, Elton just said ‘Well obviously you’d use Justin Timberlake,’” says Casey, laughing. They ended up hiring Russ Anderson, a professional Elton John impersonator, for the high-energy sequences, and replacing his face after the fact with a de-aged version of John's.

But when it came to recreating John’s signature style of piano playing, they were determined to capture the genuine article. Spinifex brought in motion-capture pros from nearby animation studio House of Moves, and John squeezed into a mo-cap suit and took a seat behind the electric keyboard amidst a forest of cameras. There was even a specialist “Dot Doctor” charged with positioning (and repositioning) motion-capture tracking markers on John’s fingers. The day went smoothly—mostly. “He’d been playing ‘Tiny Dancer’ and kept hitting this note and stopping," Casey says. "And he’d be like ‘Can you fucking hear that? It’s ghosting! Look at all this technology around us, and it's a keyboard that isn't working.’ No one else could even hear it.” Things got a lot better when they rolled out a real piano.

Hack City

Getting fans immersed in the magical-realist world of Elton John also takes a whole lot of bleeding-edge tech. For the last six months, Spinifex has had to cobble together wildly disparate pieces of technology—much of it uncomfortably new—to have even a prayer of pulling it off. Making things look seamless in VR is hard enough. Doing it while also creating live-action sequences … that need to look like they're set in the past … in stereoscopic 360 format? That's almost ridiculously complex.

To keep the experience stable and comfortable for viewers, and to make sure the Elton stand-in's replaced face didn't go all Exorcist, Spinifex commissioned a bespoke head for a motion-control rig—itself a fancy piece of tech used in the filming of Thor: Ragnarok. And to make sure VR Elton stays in focus, the studio used Facebook’s “cube map” format to concentrate what spare pixels they have on a single hot spot—in this case, Elton. It’s a similar idea to Google’s recently-announced VR180 cameras, which captured the just-completed live event that followed Spinifex’s VR experience.

“We breathed a sigh of relief when we realized we would have them ready. Otherwise we wouldn’t have been able to broadcast this,” says Matt Apfel, Google’s VR video programming head. “Pixels aren’t wasted being wrapped all the way around, and 180 eliminates audience confusion about where they should be looking. We didn’t want that feeling of FOMO.”

In fact, despite all the technical and VR challenges—face replacement, mo-cap wonkiness, resolution hurdles—the biggest obstacle turned out to be the logistics of the live event. To be effective, Spinifex's VR piece had to begin simultaneously for the hundreds of people actually attending Elton's event (and, to a lesser extent, the onlookers at home), with all those VR headsets somehow triggering within a hundredth of a millisecond without crashing the WiFi network. "Gladly, and nervously, no one has ever done this at this scale," says Shea Clayton, Spinifex's head of interactive. Since off-the-shelf triggers were out of the question, they borrowed a technique from the gaming world for equalizing traffic across a network of phones, and, in case of emergency, sent what Clayton calls a "last will and testament": If my connection drops out, trigger this content at this specific time. And, bizarrely, the scrap of code they're using to make sure those messages get sent and received was invented for an oil pipeline. "Borrowing bits and bobs from other things is really what makes it work," Clayton says.
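That last-will-and-testament trick is easy to picture in code. Here's a minimal sketch, assuming a hypothetical setup rather than Spinifex's actual (unpublished) system: each headset listens for a networked "go" signal but also carries a pre-agreed start time, so playback still begins on schedule if the connection drops. The port number, message format, and timing below are illustrative assumptions.

```python
import socket
import time

# Hypothetical values: in a real deployment these would be distributed to every
# headset ahead of the show, not hardcoded here.
FALLBACK_START = time.time() + 30.0   # agreed-upon start time, as epoch seconds
TRIGGER_PORT = 9999                   # UDP port the "go" signal arrives on

def wait_for_trigger():
    """Block until a network trigger arrives or the fallback time passes."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", TRIGGER_PORT))
    while True:
        remaining = FALLBACK_START - time.time()
        if remaining <= 0:
            return "fallback"          # no signal (or no network): start anyway, on time
        sock.settimeout(min(remaining, 1.0))
        try:
            data, _ = sock.recvfrom(64)
            if data == b"GO":
                return "network"       # normal case: synchronized trigger received
        except socket.timeout:
            continue                   # keep waiting, re-checking the fallback clock

if __name__ == "__main__":
    source = wait_for_trigger()
    print(f"Starting playback now (triggered by: {source})")
```

The appeal of the fallback clock is that synchronization degrades gracefully: a headset that loses WiFi still starts at the pre-agreed moment instead of staying dark, at worst drifting by whatever its local clock is off.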

Results

Did it all come together? At the moment I write this, I have no idea. It worked in Spinifex's offices, where I watched about 100 phones pulse in unison and start blasting Elton John. But a better question for a piece that's meant to sum up an icon's career is: Is it effective? And for those closest to John, the answer seems to be yes. "When I saw everything finally, fully finished, I couldn’t stop crying," says David Furnish, Elton John's husband and CEO of Rocket Entertainment. "I know I’m a sample of one. But while my closeness to the subject means I'm easily moved, it also means that the bar is super high."

And while its conceit—incomparably influential artist goes out with a futuristic bang—could have come off as contrived, the idea that Elton John is an artist capable of moving at the speed of culture doesn't seem as far-fetched as it might for other musicians of his era. He's built a career on invention and reinvention. And now that his likeness, his performances, and his music have been captured in various high-fidelity formats, there's no reason that the music has to stop just because the touring does. "He wouldn’t want a computer to write a song, but anything that respectfully keeps his songs and catalogue alive, to new audiences and different audiences, that surprises and delights and entertains them?" says Furnish. "Elton is one hundred percent in favor of that."

Today, it's a VR experience. Tomorrow, you might be rubbing elbows at the piano with a holographic Elton John.


Lackluster New It Doesn't Clown Around


For a while there, It floats along nicely. Adapted from Stephen King's famously elephantine blockbuster, which pits a group of awkward Maine teens against a shape-shifting monster, the first stretch of this town-and-clown horror thriller makes for appropriately goony fun. That's partly because, for viewers of a certain age (ahem), It functions as an effective big-screen time machine: Unlike King's book—which was adapted in the early 1990s as a hokey, if sincere, TV miniseries—this big-screen version takes place in the summer of 1989, a period of ample pop pleasures (Lethal Weapon 2, New Kids on the Block) and zero bike-helmet laws. It captures the low-key latchkey existence of the Reagan-Bush summers so accurately you can almost smell the Big League Chew in the air.


But the kiddos in It don't have much time to enjoy their adolescent freedom, as they're facing a very grown-up terror in Pennywise (Bill Skarsgård), a balloon-toting clown with a malleable maw featuring rows of quill-like sharp teeth and a wild (and wildly receding) hairline that even Nicolas Cage would deem a bit much. Pennywise's introduction is one of the most nightmare-fostering moments of King's career, and director Andy Muschietti (Mama) retains its heartbreaking brutality: One rain-soaked afternoon, a young boy named Georgie watches as his paper boat is sent down a drain. He's subsequently charmed, chomped, and abducted by Pennywise, in a literally cold open that, by modern-horror measures, is remarkably restrained while still psyche-scarring.

Georgie's disappearance consumes his older brother, Bill (Midnight Special's Jaeden Lieberher), who turns to his friends for help—only to learn they've all been experiencing their own private terrors, whether in the form of a skin-littering leper, a high-speed headless corpse, or a twisted painting come to life. Muschietti stages these scenes with a patient, take-it-all-in visual sense, as well as a keen grasp of nightmare logic (in one of the kids' visions, Pennywise twists on a meathook in the dark distance, his eyes glowing menacingly). They could run to their folks, but, rather conveniently, the adults of It are mostly total creeps, none of whom seem to care much about the alarming number of children vanishing from their town. Maybe that's just sloppy storytelling. Or maybe that bard named Will was right: Parents just don't understand.

The thrills of It lie in these early, almost flirting terrors, which unite both the kids (who dub themselves the Losers' Club) and the audience in the terrifying possibility of what greater horrors await. It helps, of course, that we root for these Losers, a collection of spazzes, nerds, and outcasts who are subjected to every teen trauma imaginable, from knife-wielding bullies to signature-free yearbook pages. The film's most touching moment is also its nastiest: After the group's lone teen girl, Beverly (Sophia Lillis), is doused with gallons upon gallons of sink-borne blood—a Carrie-echoing gross-out moment with little metaphorical wiggle room—the rest of the kids show up to help her mop up the mess. It makes for an admirably on-the-nose '80s-movie montage, complete with a Cure song in the background.

Yet once the Losers decide to challenge Pennywise on his own turf—confronting him in a haunted house and, later, a debris-strewn underground lair—It begins to deflate. As soon as the kids begin to scatter, you realize there are simply too many of them to keep track of, much less care about—so much so that you almost wish the filmmakers had been heartless enough to thin out their ranks a bit (the cowriting credits for It include True Detective and Beasts of No Nation director Cary Fukunaga, who left the production during development). And the breezy-ish kids-on-the-loose vibe, while fun at first, soon becomes an excuse for repetitive dick jokes and way too many "holy shit!" punchlines. Was such behavior typical for teen boys in the late '80s? Yep. Is it a letdown when a character like Richie (played by Stranger Things' Finn Wolfhard) begins It as "kid who curses" and ends as "kid who curses a lot"? Holy crap, yes.

It's biggest unsolved problem, though, may be its lone marquee name—Pennywise the Clown, who, save for one fun midfilm sequence, just barely retains the creepiness of that initial rain-soaked intro. Instead, Muschietti relies on now-tiring shock-schlock tropes (lullaby-like kids' music; swift, zig-zagging runs straight toward the camera) that turn Pennywise into just another movie monster, one who's ultimately unknowable. There's plenty of theorizing in It about the nature of Pennywise—and the nature of evil—but no definitive answers. That's highlighted by the film's final 15 minutes, which consist of a rapidly edited (yet still utterly rhythmless) CGI showdown in which Pennywise shifts from monster to monster, sometimes for just seconds at a time. It's a shell-game distraction, one that mistakes evasiveness for ambiguity. And it once again unites the kids and the audience, albeit this time in confusion: Just who is this clown, exactly? Maybe we're supposed to wait for the sequel to find out, but I prefer my '80s monsters to have some greater sense of purpose. Then again, as kids used to say back in those days, maybe that's just my prerogative.


Ana Lily Amirpour became a celebrated filmmaker her first time out. Her debut feature, 2014's black-and-white Iranian vampire flick A Girl Walks Home Alone at Night, got her more than Sundance buzz—it got her a deal to make a second feature with Megan Ellison’s Annapurna Pictures and Vice. That film, The Bad Batch, hits theaters this weekend. It’s … weird. Keanu Reeves plays a new-age messiah who gives monologues about poop; an unrecognizable Jim Carrey shows up as a hermit who says nothing; Jason Momoa, future Aquaman, features prominently as a cannibal. The dystopian romance ain't, as the saying goes, for everybody.

Not too surprisingly, the movie's critical response has borne that out. Whereas A Girl Walks scored an impressive 95 percent on Rotten Tomatoes, Amirpour’s follow-up is currently pulling only 45 percent. Some praise its idiosyncratic vision; others decry its lack of coherence or substance. But Jessica Kiang, writing for The Playlist, nailed what might be The Bad Batch’s biggest shortcoming. “The perils,” she writes, “of the broader-canvas follow-up to the sleek and economical indie debut are writ large: This is Difficult Second Album: The Movie.”


The sophomore slump has always been the worst kind of self-fulfilling prophecy. Any artist who achieves first-timer success invariably finds themselves hamstrung by a creative paradox: Their second effort gets more resources, sure, but also more scrutiny and expectation—and a lot less anonymity. (That added weight is often compounded for women and artists of color, who are much less likely to get a third or fourth chance.) Some filmmakers use their newfound capital to direct a big blockbuster, though that endeavor can pay off (Gareth Edwards' Godzilla) or flop (Josh Trank’s Fantastic Four) in equal measure. Others, like Amirpour, take the opportunity to indulge their weirdest impulses, like a movie about a young woman wandering the Texas wasteland who falls in with a group of cannibals.

Whatever the outcome, it’s essential that directors like her get to indulge their weirdest cinematic fantasies—even if they’re not for everyone. Indulging strange impulses every so often can prove highly beneficial. After Steven Soderbergh won the Palme d’Or at the Cannes Film Festival for sex, lies, and videotape, he made the literally Kafka-esque film Kafka. It was black-and-white and batty, but a few years later Soderbergh was lining up smart crowd-pleasers like Out of Sight and Ocean’s Eleven. Shane Carruth followed up Primer with Upstream Color. Spike Jonze cranked the meta-volume of Being John Malkovich past 11 with Adaptation; Diablo Cody, who became a critical darling after writing Juno, followed it up with Jennifer’s Body, which turned Megan Fox into a blog-speaking succubus. Not all of these efforts were praised, but they all proved to be turning points that helped their creators figure out where they would ultimately go—visionary or visualist, populist or pop-art. (Amirpour won’t divulge what her next movie is about, at least not concretely: “for as much as Bad Batch let me explore some of the shittiest things about us people, the next one lets me look at some of the better things, some of the things that really inspire me about how good we can be. Sometimes.”)

Amirpour isn't the only director dealing with this phenomenon right now. Just ask Colin Trevorrow, whose Sundance-hit debut Safety Not Guaranteed snared him the director's chair on Jurassic World—and subsequently Star Wars: Episode IX. Last week, his film The Book of Henry hit theaters, and thudded its way to a 23 percent on Rotten Tomatoes; even usually-forgiving Rolling Stone scribe Peter Travers called it a “mess of conflicting ideas.” The fallout was immediate and alarmist, with Vulture even questioning if Henry would put Trevorrow’s Star Wars job in jeopardy.

It wouldn't, obviously, but it still led to some soul-searching for Trevorrow. As the critical takes started rolling in, he did an interview with the Empire Podcast wherein he called the bad reviews “heartbreaking,” but also acknowledged he’s under a microscope now as the guy with his hands on both the Jurassic Park and Star Wars movie franchises. "What I may have underestimated is how my visibility as somebody who is responsible for two things that we all care about deeply, and are massive parts of our public consciousness and shared mythology—how that level of visibility would shine a spotlight that I hadn't considered," he said.

The sophomore slump, then—well, in this case, the junior jag—can be valuable not only for creative catharsis, but for learning how to handle the public’s perception of your work. Richard Kelly still seems haunted by the reaction Southland Tales, his post-apocalyptic follow-up to Donnie Darko, received. Neill Blomkamp has said he was put in “a very strange place” by Chappie's poor reception, even though the movie “crystallized or congealed ideas in my head in a good way.”

Amirpour, too, is experiencing that feedback loop—and for her, it goes deeper than audiences simply not understanding her movie. During a Q&A at a recent Chicago screening of The Bad Batch, an audience member asked Amirpour what message she was trying to convey by having black characters die gruesome deaths in the film. The director responded, “I don’t make a film to tell you a message.” The exchange, and a series of subsequent Twitter threads, showed Amirpour is still learning how to contend with criticism. “I could have a conversation with people, but if someone’s hurling insults at you, let me just say, at the end of the day, I have feelings. You’re going to call me a racist or something, you think I’m not going to have feelings?” she says when asked about the exchange. “I don’t know what to say other than maybe Twitter is not a good place for me.”

For all of the internet's power to rehabilitate the image of once-overlooked pop culture, it's not so kind to esoteric new releases. Social media acts as an instant funhouse mirror for movies like The Bad Batch or Book of Henry, reflecting multiple versions back to their creators—some kind, some grotesque. Which of those depictions people will remember is based entirely on what their creators do next; to treat early experimentation as failure, though, dooms a movie's legacy before its influence has a chance to manifest.


Beginning to think that, post-D23 and San Diego Comic-Con, we wouldn't get any new information about Star Wars: The Last Jedi until it hits theaters this December? Then you hadn't considered the importance of publishing realities, with Entertainment Weekly dropping all kinds of fact bombs about the next installment of the saga from a galaxy far, far away. Meanwhile, Lando Calrissian is causing trouble, and the backstory of Rogue One turns out to raise an ethical conundrum that few people had really thought about before. Thank you for tuning into the latest update on the HoloNet, and please remember to tip your Bothan.

Never Meet Your Heroes

The Source: Entertainment Weekly's massive Last Jedi preview

Probability of Accuracy: Consider this one more of an intentionally vague teaser than an accurate piece of information. But what a tease…!

The Real Deal: If you're expecting the Rey/Luke meeting in Star Wars: The Last Jedi to be a reprise of Luke's meeting with Yoda in The Empire Strikes Back, prepare to be disappointed. An Entertainment Weekly story—one of many this time around, considering they had a lot of spoiler-filled previews for the new movie—revealed that Rey finds Luke when his faith in the Force is at the lowest point anyone has ever seen. Daisy Ridley described her character's response as, "Oh my God, this other man that I lost within a couple days was somewhat of a father figure. Now he’s gone, and instead I’m with this grumpy guy on an island who doesn’t want me here."

As for Mark Hamill, he seems as if he's trying to come to terms with what's happened himself. "The fact that Luke says, 'I only know one truth. It’s time for the Jedi to end…'" he said. "I mean, that’s a pretty amazing statement for someone who was the symbol of hope and optimism in the original films. When I first read it, my jaw dropped. What would make someone that alienated from his original convictions?"

The Perks of Being A Wallflower

The Source: Again, Entertainment Weekly's preview of the next movie

Probability of Accuracy: Pretty accurate, because who knows better how uncool a character is than the actor who plays them?

The Real Deal: Wondering what role newcomer Rose (Kelly Marie Tran) will play in the next Star Wars installment? If Tran's interview with EW is anything to go by, she might just be enough of a fan to help remind the good guys what they're supposed to be doing in the first place. "Poe Dameron is super cool. Finn’s super cool. Even though [Rose] is good at what she does, she’s not known… She’s not cool. She’s this nobody, this background player, which is what makes her interesting. She’s not the best. She’s not royalty. She’s someone who is just like everyone else," Tran said.

Rose comes into Finn's life at a point where he's questioning whether or not he wants to stay with the Resistance—and according to John Boyega, her influence helps him come to a decision. "It’s now an opportunity for him to be the best he can be. He has to make a decision, and Rose is there to help him make that choice," he teased. Is this some kind of meta-clue to tell us that it's okay to be fans as long as we keep inspiring our heroes to do the right thing? If so, I am here for this.

Take Care Not To Hurt Yourself

The Source: Making it a hat trick, Entertainment Weekly's Last Jedi preview

Probability of Accuracy: The information comes from director Rian Johnson, so we should all hope it's accurate.

The Real Deal: Turns out that the Porgs aren't going to be the only aliens that Luke Skywalker is sharing the island of Ahch-To with; Star Wars: The Last Jedi writer/director Rian Johnson told EW that he's also going to be dealing with another race called the Caretakers. "They’re kind of these sort of fish-bird type aliens who live on the island," he said. "They’ve been there for thousands of years, and they essentially keep up the structures on the island… They’re all female, and I wanted them to feel like a remote sort of little nunnery." What do they take care of, you might ask? Well, they just might have something to do with the structures on the island, which—if speculation is to be believed—might mean that they have some connection with the Force that we haven't quite seen yet.

Snoke Gets In Your Eyes

The Source: For the fourth and final time, the Last Jedi preview from Entertainment Weekly

Probability of Accuracy: As with almost all things Snoke, this one is entirely open to interpretation…

The Real Deal: With all kinds of speculation abounding about the leader of the First Order, EW added some fuel to the fire by asking The Last Jedi's director to talk about what role Snoke does—and doesn't—play in the new movie. "Similar to Rey’s parentage, Snoke is here to serve a function in the story. And a story is not a Wikipedia page," Rian Johnson told the magazine. "For example, in the original trilogy, we didn’t know anything about the Emperor except what Luke knew about him, that he’s the evil guy behind Vader. Then in the prequels, you knew everything about Palpatine because his rise to power was the story."

So, how much of Snoke's story will be revealed in the new movie? Johnson is playing coy, saying only that audiences will "learn exactly as much about Snoke as we need to." (One thing he would reveal is that while Andy Serkis' character will indeed be CGI, the actor's motion-capture performance was astonishing: "It’s one of those performances where after every line, I’d look over at whoever’s standing next to me with an expression on my face like, 'Oh, my God, we just got that.'")

Everything's Perfectly All Right Now On The Han Solo Movie. It's Fine.

The Source: Future Lando Mark 2 himself, Donald Glover

Probability of Accuracy: On the one hand, he's only talking about his personal feelings, so it's hard to say whether he's being accurate or not, or even if what he's saying translates to others in the cast. But on the other, this certainly contradicts the official line about how the production is faring…

The Real Deal: Turns out, the changeover between directors on the still-untitled Han Solo movie wasn't quite as smooth as the official party line would have it. In a Hollywood Reporter profile, Donald Glover, who plays Lando Calrissian in the movie, said that Ron Howard replacing original directors Phil Lord and Christopher Miller had shaken his confidence. "Ron is such a legend, and he knows exactly what the vision for what he is doing is … [but Phil and Chris] hired us, so you sort of feel like, 'I know I'm not your first choice …' And you worry about that," Glover says. "I feel like I was the baby in the divorce, or the youngest child. The oldest child is like, 'We know what's happening, but we are keeping you out of it.' And I'm just like, [Glover's voice rises several octaves] 'Was that scene good? How did you feel?'" The question is, will anyone be able to tell the difference between performances in the finished movie, which is still slated for a May 2018 release?


Cassian Andor: Not a Big Fan of Consent, Apparently

The Source: Marvel's Rogue One spin-off comic

Probability of Accuracy: It was part of a canonical comic book story, so it's 100% accurate.

The Real Deal: The secret origins of Rogue One: A Star Wars Story's K-2SO were revealed in a Marvel comic book this past week, and the tale might have raised some unexpected issues. According to the Star Wars: Rogue One—Cassian & K-2SO Special, Cassian Andor reprogrammed the Imperial droid against its will in an attempt to avoid arrest during a mission, prompting at least one website to question whether or not there's an unpleasant rape analogy hiding just under the surface and waiting to be discovered. Well, Rogue One was always intended to be the morally murky installment in the series….


It’s such a simple question Rachael (Sean Young) asks Rick Deckard (Harrison Ford) in Ridley Scott’s 1982 film Blade Runner: “Have you ever retired a human by mistake?” They’ve just met in Eldon Tyrell’s opulent offices, and Deckard, a replicant bounty hunter, has come to interview Rachael as a means of testing the LAPD’s replicant-detecting Voight-Kampff device. Deckard’s equally simple response—“no”—comes without hesitation; he nonchalantly shrugs it off as though he’s never bothered questioning the supposed difference between humans and the androids he’s contracted to kill. The entire exchange takes about five seconds, yet it encapsulates everything that has fueled the public’s decades-long love affair with Blade Runner’s existential dread: What are humans? What myths do they take for granted? What have they been missing?

Over the past 35 years, Blade Runner has (rightly) been lauded for its artistic legacy and chillingly prescient vision. In that time it has also often (rightly) been critiqued for its flaws when it comes to its representations of gender and race. Scott’s film is full of female characters who are all replicants, yet their literal objectification is barely explored; East Asian aesthetics pervade its vision of dystopian LA, yet Asian characters are largely background players; its cyborgs are meant to be stand-ins for oppressed minority groups, but few, if any, minorities are actually present on screen. These shortcomings have become so apparent in the decades since the film's release that Blade Runner has become a shorthand for exploring those topics, even if only to show how sci-fi stories like it can succeed or fail at addressing them. So when word of a sequel arose, the question immediately became whether or not it would update its view of humanity along with its view of the future. The answer to that question, unfortunately, is: not so much.

(Spoiler alert: Minor spoilers for Blade Runner 2049 follow.)

Director Denis Villeneuve’s Blade Runner 2049 is certainly as obsessed with the eroding distinction between human and artificial life as Scott’s film was. Its production design—in 2049, Los Angeles is so overpopulated it looks more like a server farm than an actual human habitat—is just as breathtaking as the original’s. And its technological advances, like the tiny hover-pods that allow Tyrell’s successor Niander Wallace (Jared Leto) to see or the new biologically engineered replicants grown in Matrix-like cocoons, are executed in a way that propels the franchise 30 years into the future. Yet for all the attention paid to updating the sequel’s physical details, its three-hour plot does little to concern itself with anything beyond the depths of its white male protagonists, reducing white women to tired archetypes and utterly sidelining nonwhite characters.

Ford’s reprised Deckard and Ryan Gosling’s blade runner K both have complex inner lives behind their macho reticence. K, like Deckard, doesn't think critically about his job or the replicants he executes. His demeanor remains a mask for the audience to endlessly consider in long, uncut close-up—until a revelation forces him to question his identity, and his world falls apart repeatedly across his face. Deckard describes the heart-wrenching motivations for his self-exile and the agony that has accompanied it; Leto’s Wallace monologues at length about his megalomaniacal ambitions to play god to a species that can overrun humankind. Each man gets a story, and each story gets an airing.

Despite their unrelentingly pedestrian Psych 101 woes, these three men still manage to take up 95 percent of the emotional frame on screen, leaving little room for the women around them to have their own narratives. There’s manic pixie dream girlfriend Joi (Ana de Armas), whom K has literally purchased, à la Her. K’s boss, Lt. Joshi (Robin Wright), berates him at work and then invites herself over, drinks his alcohol, and comes on to him. Mariette (Mackenzie Davis), the sex worker with a heart of gold, repeatedly comes to K’s aid (in every way you can imagine). Wallace’s servant Luv (Sylvia Hoeks) has the most tangible personality, yet she’s obsessed with pleasing Wallace. Even Sean Young’s Rachael makes a cameo as a plot device for Deckard, embodying the final archetype—the martyred Madonna—of this Ultimate Sexist Megazord. Five female characters, not one of them voicing an ambition or desire that does not pertain to their male counterparts. Just because 2049’s future has females doesn’t mean its future is female.

Yet in a deeply ironic twist, the plot itself hinges entirely on their presence; without women, be they human or replicant, the secret K discovers that sends him on his harrowing mission wouldn’t exist. If Villeneuve and screenwriters Hampton Fancher and Michael Green recognized this, they must have ultimately decided they could accomplish the same goals without having to imbue those critical female characters with the same humanity as their male counterparts. (After all, when a character’s function is more plot point than passion, why bother giving it stage directions?) Several moments almost comment meaningfully on women’s disposability—Wallace’s casual gutting of a newborn female replicant; a giant, naked Joi addressing K blankly from an ad—yet each time, they become sad moments in a man’s narrative, rather than being recognized as tragedies for the women themselves.

Unsurprisingly, the problem worsens when viewed through a racial lens. Gaff (Edward James Olmos, now in a retirement home), a lab tech (played by Wood Harris), and two shady black-market dealers are the only men of color with more than one line; none have identities beyond their use to K. The only visible woman of color is one of Mariette’s nameless coworkers. (While de Armas was born in Cuba, her grandparents are European.) The third act finally delivers a plot twist that insists the story is not actually about K and Deckard—except the action continues to focus on them anyway.

As critic Angelica Jade Bastién recently noted at Vulture, mainstream dystopian sci-fi has always been obsessed with oppression narratives. While it returns over and over again to the downtrodden-rises-up-against-the-subjugator model, the genre has always had a remarkable ability to overlook the persecuted groups—people of color, women, the LGBTQ community, people with disabilities—whose experiences it mines for drama. White creators, men in particular, tend instead to whitewash their casts, imagining themselves as both villain and hero. Rather than simply putting the real thing in the story, they turn their tales into metaphors for it. Blade Runner 2049 falls into this trap: Even as Wallace grandstands about "great societies" being "built on the backs of a disposable workforce," everyone the movie deems powerful or worth exploring is still white and almost 100 percent male, relegating those disposable workforces’ descendants to the story’s incidental margins.

This was one of countless missed opportunities Blade Runner 2049 had to transform the franchise into not just a staggering aesthetic and technological achievement but also an incisive read of 21st century society. In the wake of Mad Max’s recent overhaul—which, while imperfect, managed to redeem many of its predecessors’ flaws—this misstep is especially disappointing. Yet like Deckard’s hurried brush-off of Rachael’s honest question in 2019, 2049’s filmmakers have attempted to tell a story about personhood in 2017 without actually considering the urgent politics that surround who gets to be a person in 2017. And in an era that begs for this kind of reinvention, its failure flattens its message into one more retirement for the books.

Replicant Required Reading

  • Inside the dark future of Blade Runner 2049
  • WIRED's Blade Runner 2049 review
  • Director Denis Villeneuve remembers seeing the original Blade Runner for the first time


You already get a Star Wars movie every year. Star Trek is coming at you from at least two directions. A good chunk of the Marvel movies are basically space opera. Big-screen fascination with science fiction and fantasy is nothing new—but now you can add the many flavors of TV network, from legacy broadcast to basic cable to streamers. Forget comic books; somehow, SF/F novels have become Hollywood’s hottest IP.

Some highlights of what may be on its way: Amazon is making Neal Stephenson’s Snow Crash and Larry Niven’s Ringworld, and a show based on Terry Pratchett and Neil Gaiman's Good Omens is in production. Universal is doing Hugh Howey’s Sand, Maggie Stiefvater’s Raven series, Kurt Vonnegut Jr.’s Sirens of Titan (with Community genius Dan Harmon at the helm), and Roger Zelazny’s Lord of Light, a.k.a. the movie that the CIA pretended to be making when they rescued American diplomats from Iran—a story that was itself the basis for the movie Argo. Done yet? Nope! HBO is making Nnedi Okorafor’s Who Fears Death? Netflix is making JG Ballard’s Hello America and Daniel Suarez’s Change Agent. And Lionsgate is making Patrick Rothfuss’ Kingkiller Chronicle series.

Phew.

Sure, some of these shows will never make it out of development; others won’t last a full season. Runaway successes like Game of Thrones and The Handmaid’s Tale make it worth the effort; Hollywood is nothing if not willing to keep doing what already worked. Besides, the saga of Westeros is about to wrap up, and then who'll take the reins of fire?

All those various networks and services have offered up plenty of original sci-fi, too, of course. Stranger Things is the obvious standout there. And I’ll do you a solid and point you at Travelers on Netflix, a wickedly clever time travel show from Canada, due for a second season this year.

Some deeper incentives might be behind the raiding of the science-fiction section, though. Familiar intellectual property has two advantages for a TV network. First, it’s already vetted. An editor with experience in science fiction has already made the sign of the IDIC over it and fired it out of a photon torpedo tube. Its characters, its world, and at least the skeleton of its plot live in the fictional universe.

As a consequence, TV makers don't need tea leaves—they can just look at Bookscan and get a sense of how big their built-in audience is. That’s never certain, and the number of people who read Snow Crash, genius though it is, probably isn’t big enough to turn it into a massive hit. But then again, viewership and ratings don’t have the same importance they once did, especially for streaming services. The Amazons, Hulus, and Netflices of the world are more interested in prestige, in must-subscribe programming, and—as a proxy for those other two things—maybe even in Emmys and glowing press coverage. (Quick counterpoint: Syfy’s The Expanse, likewise based on a successful series of books, hasn’t gotten the same attention or audience—which is too bad; it’s cool.)

Plus, genre often ends up being easier to market than "straight" or more literary drama. Take The Man in the High Castle, based on the Philip K. Dick novel. "What if the Nazis won World War II?" Pow. Now try to find a logline like that with, say, Atlanta.

Meanwhile, the larger multiverse of IP sources has become more constrained. The worlds of comic books and big sci-fi franchises are bottled up and decanted among various studios already. (That said, I’m super-psyched to see Greg Rucka and Michael Lark’s Lazarus and Ed Brubaker and Steve Epting’s Velvet comic books, from the creator-owned imprint Image, getting TV development deals, too. Lazarus is political, dystopian sci-fi and Velvet turns a Ms. Moneypenny-like character into a superspy on the run.) Demand for overall television content is high. Demand for genre content is high. The choices are: Either take a risk and make original stuff, or take slightly less of a risk with a known quantity from somewhere. Anywhere.

Movies used to be the only medium that would try to adapt something as ambitious as a book. In sci-fi and fantasy, sometimes that’d give you the Lord of the Rings trilogy (yay!) and sometimes it’d give you The Dark Tower (um). But now premium television makers have realized that it’s easier to unspool the stuff books are best at—details, set pieces, emotional journeys—in six or 10 or 15 hours than in a mere two. TV’s looking for books; books work better on TV. For the kind of people who roll science fiction books on their Kindle like a development agent rolls calls during casting season, this middle is a very good one to be caught in.

Small-Screen Sci-Fi

  • Charlie Jane Anders on how The Expanse is transforming TV.
  • Adam Rogers on CBS' challenge in turning Star Trek into a streaming-exclusive prestige series.
  • Brian Raftery on The Handmaid's Tale transforming Hulu into a prestige-TV heavyweight.


It's Not Just You. TV Has Hit Peak WTF


Note: This story contains spoilers for the current seasons of Twin Peaks: The Return and American Gods.

Diane, I'm lost: For the last week and a half, I've been wrapped in the spastic world of Showtime's Twin Peaks: The Return, the new 18-hour series from gleeful zig-zagger and noted nightmare sommelier David Lynch. Though only four hours in, The Return is already my favorite new series of the year, full of exquisite terrors, concussion-quick humor, and lovingly reintroduced old faces (surely I'm not the only fan who gets near-weepy whenever a Peaks pal like Bobby or the Log Lady wanders back into view after all these years). Here's a show in which Kyle MacLachlan gets to apply his parched comedic skills to, at last count, three different iterations of Dale Cooper; in which an akimbo-limbed tree speaks via a fleshy, bobbing brain-maw-thingee; and in which Michael Cera lisps his way through an impression of a Wild One-era Marlon Brando. Each episode of the new Twin Peaks is a dark hoot of the highest order, and I feel like Mr. Jackpots just for being able to watch them.


I should also mention that, for the most part, I have no idea what is going on with this show. There are a few plot outlines you can pick up with each installment—like all of Lynch's work, The Return has a forward momentum, even if it's not always discernible—but every new sequence in Twin Peaks sires at least a half-dozen questions: What was the space-floating metal contraption that Cooper visited? Who was the oil-slicked, ghost-like figure sitting in that South Dakota jail cell? Will we ever gaze upon Sam and Tracey's mysterious glass box again? (Speaking of which: Rest in Peaks, Sam and Tracey! Unless, of course, you're somehow still alive, which I suppose is very possible at this point!)

These are the kind of queries that might normally send one to the internet, where an ever-teeming team of culture experts—re-cappers, explainers, best-guessers—stands by every Sunday night at 10:01 pm, ready to tell you What Just Happened. In some ways, this kind of forensics-first approach can be traced back directly to Twin Peaks. In the early '90s, when the first series debuted on ABC, it was an instructive bit of deconstructionism: A network series made to be discussed and scrutinized, with Lynch and co-creator Mark Frost inspiring questions about the show's narrative (Who killed Laura Palmer?) as well as the medium itself (If a TV drama can get away with flashbacks and dream sequences, then what else can you get away with?). The modern web didn't exist when Twin Peaks premiered, but those sorts of thoughtful debates both predicted and informed the way we'd eventually talk about TV online.

But in the case of Twin Peaks: The Return, such well-intentioned hunches feel pointless—and, at least for me, totally joyless. Twitter and Facebook have made it possible for us to lob opinions and theories as soon as the closing credits roll, but they've also made it tough to become giddily, utterly caught in the grasp of an immersive piece of art. Twin Peaks: The Return has that sort of lingering hold, in part because it's such a goofy marvel to behold, but also because the episodes never come to any sort of definitive conclusion. Instead, they simply float away on the back of a Chromatics synth or some Cactus Brothers harmonies, then remain lodged in your subconscious while you await the next Return. Asking "Wait, what's really going on here?" in the middle of this ever-rare reverie feels like the exact opposite of curiosity. After all, the more you know what's going to appear in the big glass box—my lazy box-as-TV analogy, not Lynch's—the less face-chewed you'll be by the results.

Such let's-just-see-where-this-goes immersion is also one of the pleasures of the other current Sunday-night TV fun house: Starz's American Gods. Unlike Twin Peaks, Bryan Fuller and Michael Green's Gods—which depicts a planet-threatening power struggle between its titular titans—is theoretically knowable, having been adapted from the 2001 novel by Neil Gaiman. For anyone who, like me, finds themselves occasionally confounded by the show's expansive cast, metaphysics-bending laws of nature, and pop-culture riffage, there are countless Gods primers and decoders available online.

Yet it's hard to imagine why anyone would even want to figure out everything that's going on in this gorgeous brain-poker of a show. On Gods, heroes and villains are introduced without hand-holding or hand-wringing; they simply show up and plunge into the action, their greater motivations sometimes not made clear for weeks, if at all. The world of Gods is less surreal than the world of Peaks, but it's just as ravishingly cuckoo: There's a bar-brawling leprechaun and a literally fiery-eyed taxi driver; a morph-happy god, played by Gillian Anderson, who inhabits the forms of Lucille Ball, Marilyn Monroe, and Aladdin Sane-era Bowie; and a roof-dwelling watcher who, at one point, plucks the moon out of an impossibly perfect nighttime sky. To get an idea of just how strange this show is, consider that one of American Gods' relatively normal characters is played by Crispin Glover.

And, like Twin Peaks, the stunning-looking Gods casts a small-screen spell that's best left unexamined, in order for it to remain unbroken. The show has a hard-to-pin rhythm—some characters appear only in brief vignettes; others, like Ricky Whittle's Shadow Moon, can take over entire episodes—that allows Green and Fuller to make each episode as narratively elastic as possible. And every scene in Gods is saturated in a daring, beyond-garish visual palette that makes its most out-there moments (like a nighttime desert-sex scene between two fire-consumed, liquid-skinned men) all the more grandly hypnotic. To stray too far from this version of Gods by wiki-seeking every still-undisclosed detail would be like pinching yourself awake during an especially dazzling dream. For now, the best way to enjoy both Twin Peaks: The Return and American Gods might be to surrender to their utter weirdness, maybe even try to savor it—ideally, with a piece of pie and some damn good coffee.

The question was simple enough: “Where the hell is Kendall Jenner with her Pepsi 12 packs?” As terror enveloped Charlottesville during the “Unite the Right” rally on Saturday, where hundreds of neo-Nazis and Klansmen violently mobilized in a mutinous showing of white pride to challenge the removal of Confederate statues, that was all it took to resonate with tens of thousands. Though cloaked in humor, its meaning cut deep, the mockery of Pepsi’s ill-conceived April TV ad distilling how racial and social harmony in America remains something of a joke.

The reaction was neither unexpected nor the exception—on Twitter, the binding of anguish, cynicism, and satire has become a shared lingua franca in the wake of national torment.

I spent the weekend in Cleveland, mostly offline, celebrating a friend's wedding. What news I did consume, as I attempted to make sense of the Charlottesville protest turned melee, came via Twitter: images of ghoul-faced white men hoisting torches in medieval defiance, of bodies being launched into the air, and of extremists bludgeoning Deandre Harris, a young black man, as he fell to the concrete. There were people who assembled to honor the life of Heather Heyer, the 32-year-old woman killed during the rally's fallout. Tweets, too, memorialized the names of individuals like Heyer who were victims of recent incidents grounded in gruesome racial contempt.

In spite of Twitter's capacity to rapidly disseminate news like few other digital rostrums, many of its critics (and users) fear that the social hub has largely become a cesspool of snark and nonsense; “everyone is bad” and “never tweet” are frequent rejoinders to the madness that the platform has been known to spur. In the wake of an occurrence like Charlottesville, one of such ferocity—when deep hatred, out-and-out racism, and death foment in disastrous concert—Twitter becomes a conduit for externalized grief. My feed turned into a tempest of trauma and humor, an emotional battleground yoked by what writer Hilton Als once solemnly described as “a reality I didn’t want to know.”

Rampant denial of America’s past—with its history of slavery, redlining, and mass incarceration—led to tongue-in-cheek tweets about shallow white allyship and President Trump’s equivocating “on many sides” statement. In the aftermath of homegrown extremism or tragedy, a new contagion often emerges, built on innocence and patriotism; stock expressions like “I can’t believe this is happening” or “We are better than this” become a handy, if fictitious, narrative. But on Twitter, such sentiments are relentlessly met with a chorus of ridicule and truth.

There’s also the exhaustion many feel at having to address white ignorance in moments of racial conflict.

The jokes, and their surprisingly codified form, are all part of the Twitter cycle—the news and the grief never outweigh or dilute the humor, and vice versa. And for many of Twitter’s marginalized participants, be they black or gay or female, humor functions as a necessary safeguard. “The absurdity of reality—the only way for many people to deal with that is satire and comedy,” says Charisse L’Pree, who teaches media psychology at Syracuse University.

L’Pree believes microblogging platforms like Tumblr and Twitter have become overwhelmed with people’s desire to share, which often escalates in times of crisis. “This is a process by which we cope with tragedy when there is a very visible white power movement happening,” she says of tweets like Jenkins’. But she is careful to clarify: “It’s a pretty standard reaction; we just see the evidence more because of social media. But the frequency of retweets demonstrates that what this person is saying is actually reaching and resonating with people.”

How we’ve learned to shoulder the terror, in the real world and the world we construct online, is perhaps the most telling. The footage and first-hand accounts out of Charlottesville have been at once crushing and completely unsurprising, a reflection of an America many have endured before and are likely to endure again. For those geographically removed from the mayhem, Twitter acts as a necessary remedy. It is a coarse statement, but a no less true one: humor soothes.

Early Sunday morning, I came across an image on my timeline. The photograph had been taken a month before, at a similar "save the statue" Klan rally, but it couldn't have felt more timely. A black man in jean shorts sits somewhere in the vicinity of the protest, smoking a cigarette. He looks untroubled, calm. Wholly unbothered. In his hand he holds a sign that reads “Fuck yo statue.” I laughed for a second, thinking of Chappelle’s Show’s watershed sketch about funk musician Rick James, and then I did what I always do. I kept scrolling.

The anatomy of a Little Simz song doesn’t offer itself up easily. On the recent “Good For What,” a whir of tough North London bristle, the rapper born Simbiatu Ajikawo contemplates early successes. “Look at young Simbi in Vogue/Look at young Simbi in Forbes,” she says in the video, merrily skating through the streets of Los Angeles. “Well someone’s gotta do it right/Someone’s gotta open doors.” As British rap has found more footing in US markets over the last handful of years, taking a bigger share in the international mainstream—partially owed to a greater cross-cultural exchange among artists themselves—Simz has come to represent how music can best travel among divergent cultures in our increasingly globalized world.

One argument, familiar to anyone privy to the nativism of Donald Trump and his ilk, contends that globalization actually dilutes local cultures. In popularizing the customs of a given community, the thinking goes, these things in some way lose their truer essence. We've seen that argument play out with Drake, whose ardent obsession with UK culture has granted artists like Jorja Smith, Skepta, and Giggs more mainstream visibility in North America. Simz’s rise, built on the independent route and the creation of her own label, is in part a rebuke to that thinking: she proves that the best conduit can still be the self, even when it’s away from home.

Metaphor or not, “Good For What” finds the young Brit looking back even as she pilots forward, taking solace in the palm-treed environs of Southern California but still every bit the girl who grew up under the metal skies of Islington. The music, all backbone and unflinching emotional lucidity, may have changed locations, but it’s remained unmistakably Simz, a posture that is no mere performance. “I was made for this shit,” she declares, over Astronote’s murky, atmospheric production. And later attests: “Cause this is bigger than you thought/Thought I was finished, let me give you more.”

“More” has never been a problem for Simz. She is a covetous creator, having collaborated with artists like reggae revivalist Chronixx, Rihanna songwriter Bibi Bourelly, and soul experimentalist Iman Omari; toured with the Gorillaz; and dropped 11 projects since 2010 (a blend of albums, mixtapes, and EPs). There’s been no disconnect in Simz’s presence stateside, either. When she initiated one of the several freestyle cyphers at the BET Hip-Hop Awards in mid-October, she did so as the only black woman artist hailing from the UK. With a pinch of English cool, the 23-year-old rapper spoke of her adolescence and the tirelessness it took to overcome the likelihood of turning into just another cultural data point. “Who’d thought this would happen/teachers would tap me funny when I said I’d make it from rapping,” she offered in the minute-long verse.

Though the cypher included brash ascendants like Detroit's Tee Grizzley and Atlanta sing-rap polymath 6lack, Simz held court like a seasoned pro: dynamic and levelheaded, if exceedingly expeditious in her layered delivery. Her authority carries little surprise to anyone who has followed the young rapper's continued climb, gaining traction in the US since issuing her seductively ruminative E.D.G.E. EP on SoundCloud in 2014 (the breakout track “Devour” has since amassed 3.65 million streams).

Still, the most radical element of Simz’s arsenal may be her grandiosity. A song like “Good For What”—with its puffed-up moxie and tales of shrewd diligence—provides another roadway into her appeal by better refining the many avatars she dons so effortlessly, accentuating the social realities of black women. Simz’s sustained output has also allowed her to be even more elastic in her selfhood. There is a vulnerable intensity alive in her work; it satiates but jars the soul, lines so ordinary you forget how much power they hold in one’s own life. “My imperfections make me who I plan to be,” she sang on “Doorway + Trust Issues,” from January’s Stillness In Wonderland (the deluxe edition, which features seven new songs including “Good For What,” arrives November 4).

The final shot in the video for “Good For What” zeroes in on Little Simz, standing by herself in the middle of a nondescript LA street, the line “Look at me, once again I was made for this shit” looping in the background. The message is unmissable: no matter where she’s at, it’s best we leave the translation up to her.
