Throw out your TV. A new age is upon us, and its name is 4K. Yes, it’s true, millions of people did just buy new high-definition TVs, but their time is already past. 4K, a new standard that offers a picture four times as sharp as lowly “HD,” and is richer and more vibrant, too, is the undisputed way of the future.
Such is the rhetoric of those pushing 4K, anyway. At the Consumer Electronics Show in Las Vegas this January, Sony, Samsung, Netflix, and others could talk of little else but how 4K is the next step in gadgetry. Press releases praise the “immersive and dynamic entertainment experiences” it makes possible, and a Sony exec even claimed it enables new artistic possibilities.
I was skeptical, but having had the chance to see 4K, I can confirm: it truly is stunning. Though I was expecting the marketing hype to be just that, its clarity is astounding, almost absurdly so. It isn’t so much that it “looks real” as that the fine detail and depth seem hyper-real—as if there is somehow more in the landscapes or artfully arranged bowls of fruit on the screen than in the objects or places they depict. It’s remarkable.
And yet, my skepticism lingers. This quest for ever-more realism and immersion has become a driving force in the world of modern digital technology. At its core, it relies on the assumption that what we want out of art and entertainment is to be sucked into it—to become so subsumed by the spectacle and realism of the fictional experience that we are “in it.” In short, at that meeting point of tech and Hollywood, everything is becoming more and more like a video game.
I am no fan of this trend, though I should be clear I don’t say this to denigrate games. Quite to the contrary, I’m an ardent defender of the aesthetic and intellectual capacity of interactive entertainment: I think games’ greatest potential lies in their tendency to ask their players to identify as their fictional protagonists—to feel what it might be like to be another person.
Film and other traditional narrative forms, on the other hand, have always asked us to empathize—to identify with, rather than as, a character. At the root of that phenomenon is the aesthetic frame of fiction itself. It is the weirdness of an omniscient narrator in a novel or the position of a camera in a movie scene that both alerts us to the fictiveness of what we’re experiencing and delimits it, pointing out the edges of where this particular world ends. That boundary between fiction and reality allows us to consider narratives and characters at a distance: to weigh what we think of their choices, depictions, and the ideologies that speak through them, and to arrange them like pieces on the messy, three-dimensional chessboard of our mind.
What the push for realism and immersion demands, however, is that we try to put the viewer in the frame, rather than just outside it. It’s part of a broader trend that includes 3D film, in which filmmakers end up focusing on immersion and showy effects that, with the viewer at their centre, are more like a theme park ride than a film. That helpful boundary telling us that we are watching fiction is blurred, but to no particular purpose, as if the only point of 3D or a higher-resolution picture were to impress and dazzle rather than to tell or inflect a story in a different way. It takes the model of gaming, which works because the player is in control of the screen, and puts it where it doesn’t belong, mistaking visual depth for a depth of story or world-building.
I’d rather not be the sort of person who decries new visual technologies, dismissing them simply because they are novel. In fact, I believe that 3D film, 4K, virtual reality, and other innovations could be put to remarkable artistic ends—but only if they’re used well. For example, experimental films such as the dance piece Pina suggest that works that approach 3D not just as a gimmick, but with a mind to uniquely 3D cinematography and style, might be brilliant. What is currently happening, however, is that the ideology of immersive realism simply sticks the video game model into film with little regard for artistic novelty.
When the first and second Hobbit films were released recently, audiences were surprised by the pairing of 3D with a new, smoother high-frame-rate format called HFR. It inarguably looks more real, and is much easier to track during fast-paced 3D scenes, but it also looks cheap, like an old BBC documentary. What’s more, because the 3D effect brings the picture closer, it feels, in a profoundly odd way, inappropriately intimate. It is as if you walked into a cinema expecting fantastical grandeur projected on a gigantic screen and instead found yourself at an avant-garde play, seated at the table during a dinner scene. In order to draw audiences into the scene, the awe and wonder of film’s sheer “bigness” have been diminished.
4K need not be part of the same problem. It really does look incredible, and a more detailed picture is neither inherently good nor bad. But as Hollywood and the tech world collaborate more, the delicate balance between art and the tools with which it is made seems to be shifting in favour of the technological. After all, though 4K’s salespeople claim the new standard is like “looking out of a window,” we forget that we often need the frame to make sense of what we see—that rather than being transparent glass, art is meant to hold up a mirror.