Monday, August 08, 2022

Apology for Video Games Research

I just finished reading this excellent history of early digital computing, disguised as a biography of computing researcher and visionary J. C. R. Licklider. One of the things that the book drove home was the pushback, skepticism, and even hostility you faced if you wanted to work on things such as interactive graphics, networking, or time-sharing in the early decades of digital computers. In the fifties, sixties, and even seventies, the mainstream opinion was that computers were equipment for serious data processing and nothing else. Computers should be relatively few (maybe one per company or department), manned by professional computer operators, and work on serious tasks such as payrolls, nuclear explosion simulations, or financial forecasting. Computing should happen in batch mode; interactive interfaces and graphical output were frivolities, at best a distraction.

In such an environment, Licklider had the audacity to believe in a future of interconnected personal computers with interactive, easy-to-use graphical interfaces and fingertip access to the world's knowledge as well as to your friends and colleagues. He wrote about this in 1960. Through enthusiasm, smart maneuvering, and happenstance he got to lead his own research group on these topics. But more importantly, he became a program manager at the organization that would become DARPA, and not only directed tons of money into this vision of the future but also catalyzed the formation of a research community on interactive, networked computing. The impact was enormous. Indirectly, Licklider is one of the key people in creating the type of computing that permeates our entire society.

When I go out and talk about artificial intelligence and games, I often make the point that games have been important to AI research since the very beginning. And that's true if we talk about classical board games such as Chess and Checkers. Turing, von Neumann, and McCarthy all worked on Chess, because it was seen as a task that required real intelligence to do well at. It was also easy to simulate, and perhaps most importantly, it was respectable. Important people had been playing Chess for millennia and talking about the intellectual challenges of the game. And so, Chess was important in AI research for 50 years or so, leading to lots of algorithmic innovations, until we sucked that game dry.


Video games are apparently a completely different matter. They are a new form of media, invented only in the seventies (if you don't count Spacewar! from 1962), and from the beginning associated with pale teenagers in their parents' basements and rowdy kids wasting time and money at arcades. Early video games had such simple graphics that you couldn't see what you were doing; later the graphics got better, and you could see that what you were doing was often shockingly violent (on the other hand, Chess is arguably a very low-fidelity representation of violence). Clearly, video games are not respectable.

I started doing research using video games as AI testbeds in 2004. The first paper from my PhD concerned using a weight-sharing neural architecture in a simple arcade game, and the second paper was about evolving neural networks to play a racing game. That paper ended up winning a best paper award at a large evolutionary computation conference. The reactions I got to this were... mixed. Many people felt that while my paper was fun, the award should have gone to "serious" research instead. Throughout the following years, I often encountered the explicit or implicit question of whether I was going to start doing serious research soon. Something more important, and respectable, than AI for video games.
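For readers who haven't seen this kind of work: the basic recipe behind "evolving neural networks to play a game" is simple enough to sketch in a few lines. The snippet below is purely illustrative and not the setup from that paper; the network sizes are arbitrary, and the stand-in fitness function would, in the real case, be replaced by actually running the game and measuring something like distance driven before crashing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny feedforward controller: sensor readings in, steering/throttle out.
N_IN, N_HIDDEN, N_OUT = 6, 8, 2
N_WEIGHTS = N_IN * N_HIDDEN + N_HIDDEN * N_OUT

def act(weights, obs):
    """Map observations to actions with a small two-layer network."""
    w1 = weights[:N_IN * N_HIDDEN].reshape(N_IN, N_HIDDEN)
    w2 = weights[N_IN * N_HIDDEN:].reshape(N_HIDDEN, N_OUT)
    return np.tanh(np.tanh(obs @ w1) @ w2)

def fitness(weights):
    """Stand-in fitness: in a real setup this would run the game and return,
    say, distance driven before crashing. Here we just score the controller
    on a fixed batch of fake 'sensor readings' against an arbitrary target."""
    obs = rng.standard_normal((32, N_IN))
    target = np.tanh(obs[:, :N_OUT])
    return -np.mean((act(weights, obs) - target) ** 2)

# Simple (mu + lambda)-style evolution over the flat weight vector.
pop = rng.standard_normal((20, N_WEIGHTS))
for gen in range(50):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-5:]]                # keep the best 5
    children = parents[rng.integers(0, 5, size=15)] \
        + 0.1 * rng.standard_normal((15, N_WEIGHTS))      # mutate copies
    pop = np.vstack([parents, children])

print("best fitness:", max(fitness(ind) for ind in pop))
```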

Gradually, as a healthy research community has formed around AI for video games, people have grudgingly had to admit that there might be something there after all. If nothing else, the game industry is economically important, and courses on games draw a lot of students. That DeepMind and OpenAI have (belatedly) started using games as testbeds has also helped with recognition. But still, I get asked what might happen if video games go away: will my research field disappear then? Maybe video games are just a fad? And if I want to do great things, why am I working on video games?

Dear reader, please imagine me not rolling my eyes at this point.


As you may imagine, during my career I've had to make the case for why video games research is worthwhile, important even, quite a few times. So here, I'll try to distill this into not-too-many words. And while I'm at it, I'd like to point out that the "apology" in the title of this text should be read more like Socrates' apology, as a forceful argument. I'm certainly not apologizing for engaging in video games research. For now, I will leave it unsaid whether I think anyone else ought to apologize for things they said about video games.

To begin with, video games are the dominant medium of the generation that is in school now. Video games, for them, are not just a separate activity but an integrated part of social life, where Minecraft, Roblox, and Fortnite are at once places to be, ways of communicating, and things to do. Before that, two whole generations grew up playing video games to various extents. Now, studying the dominant medium of today to try to understand it better would seem to be a worthwhile endeavor. Luckily, video games are eminently studiable. Modern games log all kinds of data for their developers, and it is also very easy to change the game for different players, creating different "experimental conditions". This makes them a perfect setting for both quantitative and qualitative research into how people actually behave in virtual worlds. While this ubiquitous data collection certainly has some nefarious applications, it also makes behavioral science at scale possible in ways it never was before.
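As a toy illustration of what that looks like in practice, here is a hypothetical sketch of the two ingredients: deterministic assignment of players to experimental conditions, and logging of in-game events. The condition names, event fields, and print-based "backend" are all made up for the example; real games stream such records to analytics services.

```python
import hashlib
import json
import time

CONDITIONS = ["easy_tutorial", "hard_tutorial"]   # hypothetical experiment

def assign_condition(player_id: str) -> str:
    """Deterministically split players into conditions, so each player
    always sees the same variant of the game."""
    bucket = int(hashlib.sha256(player_id.encode()).hexdigest(), 16) % len(CONDITIONS)
    return CONDITIONS[bucket]

def log_event(player_id: str, event: str, **data):
    """Record one telemetry event; a real game would send this to a backend."""
    record = {"t": time.time(), "player": player_id,
              "condition": assign_condition(player_id), "event": event, **data}
    print(json.dumps(record))

log_event("player-42", "level_complete", level=3, deaths=2, seconds=187.5)
```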

People who don't play games much tend to underestimate the variety of game themes and mechanics out there. There are platform games (like Super Mario Bros), first-person shooters (like Call of Duty) and casual puzzle games (like Candy Crush)... is there anything else? Yes. For example, there are various role-playing games, dating simulators, flight simulators, racing games, team-based tactics games, turn-based strategy games, collectible card games, games where you open boxes, arrange boxes, build things out of boxes, and there are, of course, boxing games. I'm not going to continue listing game genres here; you get the point. My guess is that the variety of activities you can undertake in video games is larger than the variety in most people's lives.

To me, it sounds ridiculous to suggest that video games would someday "go away" because we got tired of them or something. But it is very possible that in a decade or two, we won't talk much about video games. Not because they will have become less popular, but because they will have suffused into everything else. The diversity of video games may become so great that it makes no sense to refer to them as a single concept (this may already be the case). Maybe all kinds of activities and items will come with a digitally simulated version, which will in some way be like video games. In either case, it will all in some way have developed from design, technology, and conventions that already exist.

In general, it's true that video games are modeled on the "real world". Almost every video game includes activities or themes that are taken from, or at least inspired by, the physical world we interact with. But it's also increasingly true that the real world is modeled on video games. Generations of people have spent large amounts of their time in video games, and have learned and come to expect certain standards for interaction and information representation; it is no wonder that when we build new layers of our shared social and technical world, we use conventions and ideas from video games. This runs the gamut from "gamification", which in its simplest form is basically adding reward mechanics to everything, to ways of telling stories, controlling vehicles, displaying data, and teaching skills. So, understanding how video games work and how people live in them is increasingly relevant to understanding how people live in the world in general.


The world of tomorrow will build not only on the design and conventions of video games, but also on their technology. More and more things will happen in 3D worlds, including simulating and testing new designs and demonstrating new products to consumers. We will get used to interacting with washing machines, libraries, highway intersections, parks, cafés and so on in virtual form before we interact with them in the flesh, and sometimes before they exist in the physical world. This is also how we will be trained on new technology and procedures. By far the best technology for such simulations is game engines, which have an unassailable lead because of their wide deployment. Hence, contributing to technology for games means contributing to technology that will soon be ubiquitous.

Now, let's talk about AI again. I brand myself an "AI and games researcher", which is convenient because the AI people have a little box to put me in, with the understanding that this is not really part of mainstream AI. Instead, it's a somewhat niche application. In my mind, of course, video games are anything but niche to AI. Video games are fully-fledged environments, complete with rewards and similar incentives, where neural networks and their friends can learn to behave. Games are really unparalleled as AI problems/environments, because not only do we have so many different games that contain tasks that are relevant for humans, but these games are also designed to gradually teach humans to play them. If humans can learn them, so should AI agents. Other advantages include fast simulation time, unified interfaces, and huge amounts of data from human players that can be learned from. You could even say that video games are all AI needs, assuming we go beyond the shockingly narrow list of games that are commonly used as testbeds and embrace the weird and wonderful world of video games in its remarkable diversity.
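To make the "games as environments" point concrete, here is a minimal sketch of the interface that matters: the agent observes, acts, and receives a reward, episode after episode. The tiny dodging game below is invented for illustration, but games wrapped for AI research expose essentially this same loop, just with far richer observations and mechanics.

```python
import random

class TinyDodgeGame:
    """Toy stand-in for a video game exposed as an AI environment:
    the agent sits in one of three lanes and must dodge falling blocks."""

    def reset(self):
        self.agent, self.block, self.t = 1, random.randrange(3), 0
        return (self.agent, self.block)

    def step(self, action):            # action: 0 = left, 1 = stay, 2 = right
        self.agent = max(0, min(2, self.agent + action - 1))
        reward = 1.0 if self.agent != self.block else -10.0
        self.block = random.randrange(3)
        self.t += 1
        done = self.t >= 100 or reward < 0
        return (self.agent, self.block), reward, done

# Any learning algorithm can be plugged into this loop; here, a random agent.
env = TinyDodgeGame()
obs, total, done = env.reset(), 0.0, False
while not done:
    obs, reward, done = env.step(random.randrange(3))
    total += reward
print("episode return:", total)
```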

AI in video games is not only about playing them. Equally importantly, we can use AI to understand players and to learn to design games and the content inside them. Both of these applications of AI can improve video games, and the things that video games will evolve into. Generating new video game content may also be crucial to help develop AI agents with more general skills, and understanding players means understanding humans.
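Here is a deliberately tiny sketch of the search-based flavor of content generation: a level representation, a fitness function approximating "challenging but completable", and a search loop. Everything below, from the tile alphabet to the jump distance to the hill climber, is invented for illustration; real systems use richer representations and often use playing agents to judge playability.

```python
import random

JUMP = 3  # assumed maximum gap (in tiles) the player can clear

def random_level(length=30):
    return [random.choice("-G") for _ in range(length)]   # '-' ground, 'G' gap

def fitness(level):
    """Playable levels (no gap longer than JUMP) that still contain many gaps
    score highest -- a crude stand-in for 'challenging but completable'."""
    longest = cur = 0
    for tile in level:
        cur = cur + 1 if tile == "G" else 0
        longest = max(longest, cur)
    if longest > JUMP:
        return -longest                      # unplayable: penalize
    return level.count("G")                  # playable: reward challenge

def mutate(level):
    child = level[:]
    i = random.randrange(len(child))
    child[i] = "G" if child[i] == "-" else "-"
    return child

# A simple hill climber standing in for fancier search-based generation.
best = random_level()
for _ in range(500):
    candidate = mutate(best)
    if fitness(candidate) >= fitness(best):
        best = candidate
print("".join(best), "fitness:", fitness(best))
```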


It is true that some people insist that AI should "move on" from games to "real" problems. However, as I've argued above, the real world is about to become more like video games and to build more on video game technology. The real world comes to video games as much as video games come to the real world.

After reading this far, you might understand why I found reading about Licklider's life so inspirational. He was living in the future, while surrounded by people who were either uninterested or dismissive, but luckily also by some who shared the vision. This was pretty much how I felt maybe 15 years ago. These days, I feel that I'm living in the present, with a vision that many younger researchers nod approvingly to. Unfortunately, many of those who hold power over research funding and appointments have not really gotten the message. Probably because they belong to the shrinking minority (in rich countries) who never play video games.

I'd like to prudently point out that I am not comparing myself with Licklider in terms of impact or intellect, though I would love to one day get there. But his example resonated with me. And since we're talking about Licklider, one of his main contributions was building a research community around interactive and networked computing using defense money. For those of us who work on video games research and are used to constantly disguising our projects as being about something else, it would be very nice to actually have access to funding. Following the reasoning above, I think it would be money well invested. If you are reading this and are someone with power over funding decisions, please consider this a plea.

If you are a junior researcher interested in video games research and face the problem that people with power over your career don't believe in your field, you may want to send them this text. Maybe it'll win them over. Or maybe they'll think that I am a total crackpot and wonder how I ever got a faculty job at a prestigious university, which is good for you because you can blame me for the bad influence. I don't care, I have tenure. Finally, next time someone asks you why video games research is important, try turning it around. Video games are central to our future in so many ways, so if your research has no bearing on video games, how is your research relevant for the world of tomorrow?

Note: Throughout this text I have avoided using the term "metaverse" because I don't know what it means and neither do you.

Thanks to Aaron Dharna, Sam Earle, Mike Green, Ahmed Khalifa, Raz Saremi, and Graham Todd for feedback on a draft version of this post.
