Our obsession with graphics becomes salient at the start of every console generation, as we collectively cry out for more pixels, particles, and panoramic vistas. We hear the buzzwords: more immersive worlds, more creative expression, more artistic freedom, afforded to us by hardware designed to look like a fridge and by corporate representatives heralding a new era of digital entertainment. It's thrilling in many ways to be at the forefront of new technological frontiers; it is the Christmas of the Information Age. But our obsession with graphics comes at a cost: literally, in that games are becoming more and more expensive to produce, and figuratively, in how our fixation on the superficial may in fact be stifling our creative instincts. We have limited time and resources in development, and so energy spent on graphics is time not spent on novel gameplay, thematic depth, or design genuinely integrated with technology like AI or dynamic environments. The irony of the frontier is that it is also transient; it's a treadmill sustained by spectacle. Half-Life 2 was astonishing when it first came out, but now the game shows its age, and the novelty of our first forays into 3D graphics, however arresting at the time, now reveals polygonal monstrosities that will remain an eyesore for eternity. But there are games that still look good, and what did they emphasize? An aesthetic over technical fidelity. Take the year 2001, when games like Grand Theft Auto III and Devil May Cry came out. Technical marvels for their time, they are games that have aged considerably. In the same year, though, Ico and Rez came out: games with impressionistic styles, abstract imagery, and poignant iconography. They were beautiful then, and they are beautiful still. Their elegance, charm, and sensory delights did not just signal a new technological paradigm; they were rooted in realizing an artistic vision. However, technology was still necessary to render their unique symbolic worlds.
We can't be Luddites with regard to rendering technology, because much of our artistic ambition could not be realized without it. It's a simple distinction, but art style and graphics are not the same thing; both are instrumental in crafting a cohesive aesthetic. So what is eternal, then? Games that have stood the test of time for hundreds of years, like chess, are as abstract as they can be, stripping away representation to unearth the pleasures of strategic engagement. Even the present-day games that are most universally appealing seem abstract as well: Minecraft and Tetris are deep, systemic games that have crossed nations and creeds. But again, abstraction is not some Platonic ideal that demarcates pure design. Representation, art style, and graphics can be leveraged to lend credence to an alternate sensory world. To take another game that came out in 2001, Metal Gear Solid 2 leveraged technology and a postmodern ethos to craft an aesthetic that only gains relevance as the years go by. Gameplay, graphics, art style, and narrative fused to create a whole that is more than the sum of its parts. Ultimately, then, it is in the experiences that games craft, whether abstract or otherwise, that games attain their true calling. David Jaffe, the creator of God of War, put it well in a talk of his:

"As game designers and game creators and even game players, the language is play mechanics, and I don't feel, for me, that those are as represented as they should be in the games that I want to make. And so I was surrounded by either games that were really artistic and meaningful, but were more about their meaning and their artistry than what my brain was doing with the controller, or what my brain was doing with the metagame, or I was surrounded by games of which I have made one. I don't think God of War is going to survive in 100 years. I don't. I think it's all about execution. It's a brilliant execution game, but it's not a pure piece of mechanics, and it was never designed to be. But it occurred to me that I also look on the big side of games, and it's so much about the experience or the set pieces or the cutscenes or the IP. All those things are great; I'm not knocking them. But I think in a lot of cases, for me, as a player, they have come at the expense of what I love about games, which is the purity of this medium, which is gameplay."

We seem confused, then, prioritizing superficial things rather than the true aesthetics of play. But what is this confusion rooted in, and is it something deeper than just a culture obsessed with hardware performance and looks? The term is the immersive fallacy: the idea that rendering an alternate reality in ever higher fidelity to enable escapism is the ideal ambition for digital entertainment. Our obsession with graphics then becomes a symptom of a deeper desire: a drive toward photorealism, toward technological fetishism, and ultimately toward escapism. In the book Rules of Play, Katie Salen and Eric Zimmerman warn us against this obsession with immersion. A common goal for digital media, articulated by Janet Murray, was the holodeck of Star Trek fame: a device that allowed users to create dynamic story worlds to escape into. The problem with this vision, however noble, is that it neglects what makes digital media interesting and misunderstands what immersion actually is. Immersion is not about escaping reality, but about interfacing with an alternate one that can yield new insights about the world we already inhabit. In their book Remediation, Jay David Bolter and Richard Grusin explain how a medium has immediacy, a visceral impact, but also hypermediacy, an awareness that even as we inhabit alternate worlds, we know they are simply fictional. Applied to games, we feel the feedback of game feel, but we also appreciate that play happens both intimately and at a distance.
We realize they are just games. This creates a double consciousness in the mind of players: they inhabit multiple worlds simultaneously, a fact explored in books like Half-Real by Jesper Juul. Through this lens, though, we see a deeper problem with our obsession with graphical fidelity. An analogy that can help us is perhaps art in the Renaissance, where replicating reality was seen as a virtue. Of course, art is not just about replication, but abstraction, manipulation, impression, expression, and interpretation. Art didn't stop at the Renaissance; it has roots in the symbolism of old stories and diverged into impressionism, modern art, the avant-garde. Our medium's obsession with technical fidelity is perhaps an art form stuck in its own Renaissance, trapped by the images it is trying to replicate. One of the pioneers of immersive sims, Warren Spector, said this about the immersive fallacy: "The history of media seems pretty clearly a march towards ever more faithful approximations of reality, from the development of the illusion of perspective in painting, to moving pictures, to color moving pictures with sound, to today's immersive reality games like Quake and System Shock. Is this progression inevitable, and will it continue, or have we reached the end of the line?" Salen and Zimmerman go on to critique what the immersive fallacy is rooted in. They state: "The immersive fallacy is symptomatic of contradictory ideas about technology. On the one hand, there is a technological fetishism that sees the evolutionary development of new technology as the saving grace of experience design. On the other hand, there is a desire to erase the technology, to make it invisible, so that all frames around the experience fall away and disappear." The core problem with immersion as an ideal, according to those who advocate this position, is that it misunderstands what digital media and video games are, and further misunderstands the nature of play.
The easiest rebuttal here is that we understand play has different aesthetics. There are different reasons why individuals and cultures play, and so technological fetishism and reality replication address only a narrow band of the true aesthetics of play. Furthermore, in Gregory Bateson's work, he argues that play is metacommunicative: it requires that we recognize it as both a lie and the truth. When we play a game, we take it seriously, but we also know it is a game. Thus, we need metacommunicative signals, the ability to signal that what we are doing is not real, like how a dog wags its tail to signal that it isn't really a threat. This recognition necessarily requires hypermediacy. They conclude: "If game designers fail to recognize the way games create meaning for players, as something separate from but connected to the real world, they will have difficulty creating truly meaningful play." What about the joys of simulation, says the critic, and why can't escapism be its own virtue? Games like Flight Simulator show us the joys of virtual replication, not just visually but also mechanically. There is a resplendent joy to the fantasy of simulation that is not necessarily incompatible with hypermediacy, metacommunication, and impactful experience design. People who play simulators are hardly unaware that what they are playing is a simulation, and the fidelity of the representation is what makes the game so groundbreaking: it is the most robust simulation of our planet ever attempted in video game form. There is also a second layer to replication: the mechanical. With a game like Gran Turismo, people not only expect realistic graphics but also buy peripherals to replicate the mechanics of driving a car. Again, everyone understands this to be a simulation. The immersive fallacy also cropped up in a different form around the start of the PS3, Xbox 360, and Wii era.
Sony and Microsoft went for graphical fidelity, showcasing their systems as resource-intensive beasts that could render polygons in unparalleled ways. Nintendo, on the other hand, came out with the Wii, emphasizing novel motion controls and mechanics over our traditional obsession with graphics. The story that unfolded is now the stuff of myth, as the Wii set sales records, leaving its technically superior rivals in its wake. Fun, whimsy, and peripheral invention won out, and they won perhaps because Nintendo recognized that there is a physical reality to the console that mediates between two realms. Nintendo escaped the trap of the immersive fallacy and made a console that expanded into a new casual audience. But of course, many gamers dismissed the console as insufficiently hardcore, and in the long run, which of these philosophies really won? What's interesting, though, is that around this time a suite of games came out on next-gen hardware that appreciated the new experiences they could create, with more isomorphic parallels to real-world behavior. However, players did not respond to them the way they did to the Wii. Grand Theft Auto IV took a more serious, somber, and grounded approach compared to its predecessors, with more realistic physics, a more thematically rich world, and a toning down of the ridiculous antics of the past. Critics loved the game at the time, but players were perhaps not as enthusiastic; it was not as fun as previous entries. The same critique was levied against games like Far Cry 2, which attempted an earnest video game reconstruction of Heart of Darkness, and Assassin's Creed, with its stunningly realized rendition of a different era in history. Again, a common criticism was levied: these games are not fun to control, they have laborious mission design, and they are dreary, oppressive, and just not enjoyable to play.
Despite the fact that these design decisions were in service of an aesthetic, and with the understanding that the designers of these games perhaps didn't perfect the formula, each of these series turned away from these ambitions in its next installment. Far Cry 3, Assassin's Creed II, and GTA V were more gamey games. They conceded to the gaming audience's demands, not for immersive replication, but for pure fun. It is perhaps sad that people didn't realize that mechanically inconveniencing the player is justified if that is the aesthetic the designer is going for. But then again, maybe the designers got it wrong in assuming that games were anything more than children's playthings. So what do we do with these disparate strands of video game history? As Salen and Zimmerman argue, understanding the trajectory we are on might help us design experiences we have never envisioned. Sure, there is the joy of simulation and visceral input, but it is in games' hypermediacy that perhaps new narrative experiences can be conjured. Take Metal Gear Solid 1 and its brilliant Psycho Mantis twist: it was only by recognizing that the player is aware of their own hypermediacy that such a revelation could be designed. Similarly, Metal Gear Solid 2 uses this to create a duality between us and Raiden. Games like Undertale are also wickedly self-aware, and aware that we too are self-aware; it is in its play with expectation and subversion that Undertale crafts its story. These might be the genesis of new avenues for double-consciousness storytelling experiences, with disavowal of the immersive fallacy being a prerequisite for them. Of course, this brings us to a new technological frontier that brings these issues to a head: VR. Here a terminological distinction is useful. Immersion is not presence. Presence is the feeling of actually being somewhere, while immersion is the pluralistic notion of engagement that people have studied.
VR consolidates many of these different strands of gaming history: the drive for representational fidelity, sensory immersion, simulated interfaces, and hypermediacy. You would think this is the next evolution of the immersive fallacy, of our drive towards the holodeck, but VR has yet to be widely adopted. Maybe VR is making the same mistake as traditional games, focusing on the sensory dimensions of representation instead of recognizing what kinds of novel mechanical and storytelling experiences it can craft. VR titles are less like the Wii, Metal Gear Solid 1, or Flight Simulator, and more designed around our cultural obsession with sensory escapism. What's funny is that the technical demands of VR make it hard to create graphics anywhere near as high-fidelity as those of traditional games, and so VR might be rejected because of the immersive fallacy once more. The immersive fallacy is a cultural meme that thinks itself at the forefront of innovation but is simultaneously reactionary, anti-technological, and mired in its own conservatism. We gamers want innovation, newfangled hardware, and cutting-edge particle effects, but we also reject novel immersive experiences and platforms that might enable the very dreams that fuel this pervasive delusion. We reject attempts at peripheral innovation as casual, gimmicky, or unfun, but also chastise the lack of innovation in a medium still searching for its identity. And most perversely, the fallacy traps us in the shadow of reality's image instead of allowing us to reconstruct that reality in an image of our own making.