Monarch: Legacy of Monsters and the Cinematic Universification of TV
TV and film have merged into the ultimate kaiju.
Photo Courtesy of Apple TV+
Back in my day, things were simpler. Milk built strong bones. Dogs were dogs and cats were cats. Darius Rucker made rock music and Jordan played basketball. And movies were sounds and images projected on a big screen that you watched in what were called “theaters,” and TV was sounds and images playing on a 400 lb. cube in your living room all evening until your parents told you to scram so they could watch Tony Soprano beat the hell out of some poor Newarkian. Everything made sense, and the world was at peace.
Then shit got weird. Suddenly, this DVD delivery service was making its own TV shows, and they were… good? And also you could watch them—commercial-free, might I add—via “wireless internet,” no discs or cable cords required? And then, perhaps strangest of all, characters on some of those Internet TV shows would reference characters and events from movies that were currently playing in theaters? And, like, suddenly milk doesn’t even have that much calcium and is apparently bad for your stomach??? What was going on!?
The sun has set on the good ole days. The streaming era, the golden age of TV (or whatever we’re in now), and the “cinematic universification” of Hollywood franchises have all worked in conjunction over the past decade or so to bust down the fences that once clearly demarcated the mediums of television and film. As studios rush to capture their audience’s attention in any and every way possible, there’s been a recent uptick in programs that merge our two primary forms of visual entertainment through overlapping storylines and broadened rules of what constitutes “episodic” TV. It’s confusing, it’s ever-changing, it’s maybe inevitable. It’s also, as Apple TV+’s Monarch: Legacy of Monsters indicates, not necessarily a bad thing.
To address the obvious first: of course, we know when we’re watching a TV show and when we’re watching a movie, for the most part. Sitting at home vs. in a theater can be quite telling (they don’t call it “going to the movies” for nothing), but there’s also that whole matter of TV shows having episodes instead of existing as standalone features. Plus, the bleeding of one property from the small screen to the big screen and vice versa isn’t an entirely new phenomenon. While 1950s Hollywood held tight to its gripes about the artform of cinema being separate from that philistine hogwash known as television, overseas, the comedy It’s a Great Day! spun off from the wildly successful soap The Grove Family and was released theatrically in Britain in 1956, making it the first TV-to-film adaptation. Hollywood would eventually figure out that a lot of people watch both TV shows and movies, so popular stories and characters in one medium, such as the daily lives of the doctors of M*A*S*H and the investigations of the FBI’s “X-Files” division, could be popular (and lucrative) stories and characters in the other.
How we have arrived at today’s spirit of televisual mishmash is largely a result of technological advancement. It’s not just the mediums themselves changing but how we consume them. Theater technology has only gotten better: replacing traditional projectors with laser units, worn-out speakers with sound systems like Dolby Atmos, and even crusty old seats with smooth leather recliners are a few of the ways cinemas have tried to reel easily distracted homebodies back into the multiplex. However, the rate of these improvements doesn’t hold a candle to how much TV has advanced. 4K flatscreens and home speaker systems are more common (and cheaper) than you may think, and everyone can now watch entertainment on phones, tablets, and computers as well, whether at home or on the go. (The pandemic didn’t help matters, of course, as theaters already facing existential uncertainties posted record-low profits while people stayed safely on their sofas.) But it’s not just that TV was getting easier to watch; it’s that it was getting better.
The rise of premium cable and streaming is perhaps most culpable in muddling the divisions between TV and film. The subscription model removed the need for ad breaks, strict 22- and 44-minute episode lengths, weekly releases, FCC censorship, and demographic-spanning entertainment. This disruption allowed more stories to be told, but typically in shorter runs, a paradox that this year’s guild strikes have reminded us is far from stable for working writers and actors. At the same time, it’s allowed for more inventive plays with form, higher budget allotments, and some truly epically scaled events on the small screen (the Game of Thrones and Stranger Things Season 4 finales have stuck with me to this day, for different reasons). In a word, TV shows grew more cinematic.
The realization that television, rather than film, was the medium in which creative possibilities were expanding attracted even more of Hollywood’s attention. A-list stars and directors like Meryl Streep and David Fincher started doing TV, upending the old notion that talent only started in television to claw their way to the silver screen. Netflix, which initially made its name in the business of movies, began to purchase the distribution rights to theatrical releases and festival favorites, and then it started producing and dropping its own films on the platform. The divisions between TV and film were, and still are, fairly established, but think about it: when your average Joe gets home after a long day and boots up the app, what he first sees scrolling through his options is not “TV” or “Movies” but headings like “New Releases,” “Popular on Netflix,” and “Trending Now.” Behind these nebulous labels lies a broader, more ersatz category: content.