Over the past 40 years, video games have grown into an enormous multi-billion dollar entertainment medium. In this three-part series, we look first at the videogame industry and where it may be heading; second, we talk to veteran game designer Jordan Mechner about the changes he has seen in the industry since the Eighties and his vision of gaming’s future; and finally, in the third section, we examine the growth of serious games—videogames designed with an educational or therapeutic purpose—and their potential to change society.
Part 1: State of play
In the early seventies, mechanical pinball games still dominated the arcades, as they had since the Great Depression. Then in 1972, Allan Alcorn, a 24-year-old designer in Sunnyvale, CA, developed a two-dimensional electronic table tennis game and installed it in a local bar.
Pong was an unexpected hit at Andy Capp’s Tavern, and soon became so popular that it spawned an entirely new entertainment medium, the videogame. As Devin Griffiths put it in his 2013 history of the videogame, Virtual Ascendance, “[t]hree vertical lines and a bouncing square of light. That’s all it took to change the world.”
Flash forward 45 years and annual videogame revenue now exceeds $100 billion.
People play videogames on dedicated consoles, on computer screens, on television screens, and on their smartphones. They play alone, they play with friends, and they play with thousands of other people simultaneously all over the world.
Videogames have infiltrated more serious parts of life as well. Children learn math and chemistry playing videogames. Medical students practice surgery and pilots learn to fly, all using videogames. Businesses train their employees and their customers using “gamification” techniques first perfected by game developers.
The technology is bleeding into real life in a second way as well, as some of yesterday’s child gamers have grown up to be soldiers who launch real-life laser cannons with Xbox controllers or pilot drones with joysticks. “It’s a lot like playing a videogame,” a veteran Predator drone operator has been quoted as saying, “but playing the same videogame four years straight on the same level.”
With so much riding on games, universities have begun offering specialized courses in game-making and even on game criticism. These days, students earn degrees in game design, even as cultural scholars write books and papers that critique what this new form of play says about our society and what it may be doing to us.
The second level
Initially, the biggest-selling games were titles programmed to run on specific game consoles: first Nintendo’s systems, then several generations of Sony’s PlayStation and Microsoft’s Xbox.
Such high-budget blockbusters are still an important part of the industry: the top-selling game on Xbox, Grand Theft Auto V, has sold 90 million copies since its release in 2013, earning the developer Take-Two $6 billion and making the game the single most profitable media title in history, with double the inflation-adjusted receipts of Star Wars or Gone with the Wind.
However, the rise of broadband has helped open up a second, even larger, level of opportunities for game developers. “Suddenly, you have all these people who would never spend $70 for a game or $700 for a console playing games where those nerdy subculture barriers of entry are now gone, and now everyone can play Angry Birds or Words with Friends,” says Brendan Keogh, a Brisbane-based videogame critic and research fellow in the Digital Media Research Centre at Queensland University of Technology.
“You didn’t need to put a CD or a cartridge in a shop to sell your games, so smaller developers could independently release their games directly to people, which opened up room for more niche developers creating for more niches, as the Internet did for music and film – and everyone, really,” continues Keogh.
At the same time as the distribution barrier fell, the creation of sophisticated game design middleware made creating videogames much easier. The Unity game development platform in particular, Keogh says, revolutionized game creation – particularly as the software is free for any developer whose game revenue is under $100,000 a year.
These changes meant that it no longer took a large team to create a professional-level videogame. Keogh estimates that most independent game studios have between six and 12 team members, mostly programmers or artists, but even solo developers are not unknown.
Not surprisingly, as barriers to entry fell, game production rose. Today, it has reached levels that some observers think may be unsustainable. In February, for example, more than 40 new games were posted every day on the Steam game platform, according to Felan Parker, an Assistant Professor of Book & Media Studies at the University of St. Michael’s College at the University of Toronto.
For the most part, game development is a winner-take-all business. “There isn’t a middle class of game developers. You’re either one of the lucky few who broke out and made it or you are one of the many who are struggling to get by,” Parker says.
As with the movie business, a few high-budget blockbusters created by teams of hundreds still attract the largest share of critical attention and the highest sales. Smaller teams of independent developers, meanwhile, contribute the occasional surprise hit (for example, various iterations of Rovio’s Angry Birds have been downloaded more than 2 billion times, according to the Finnish developer), but more often independents serve the major studios as a kind of extended research and development laboratory.
These indie game developers have pushed a variety of boundaries. Some, for instance, have experimented with narrative structure, Keogh says, pointing to such innovative titles as Gone Home, a detective story that focuses on unlocking the secrets of a house in rural Oregon; Her Story, an interactive videogame about a woman being interrogated by police investigators; and Depression Quest, a game about living with depression.
Welcome to Level 3
Some observers argue that the current flood of games is unsustainable and the indie apocalypse is nigh, but Keogh is skeptical.
“My own theory is that it’s just going to become more and more fragmented,” he says. Instead of a great winnowing, he expects an evolution along the lines of television, where, in place of three or four networks, an ever-growing number of channels and services now compete for viewers’ attention.
What this next generation of games will be like is unclear.
On the one hand, Benn Konsynski, George S. Craft Distinguished University Professor of Information Systems and Operations Management at Emory University’s Goizueta Business School, sees more educational and immersive experiences ahead.
Certainly, new technologies are creating new opportunities. For instance, Pokémon Go, a location-based, augmented reality game that enabled players to hunt for Pokémon cartoon characters in real places, became a worldwide fad in 2016, attracting more than 750 million downloads.
On the other hand, Robin Teigland, a professor at the Center for Strategy and Competitiveness at the Stockholm School of Economics, sees a trend away from games that are either complex or high-tech.
Massively multiplayer games such as World of Warcraft have not evolved into the kind of parallel universe once forecast. Games with a physical component, such as those built on the Wii platform, have also not turned out to be the killer app. Virtual reality gaming, too, seems perpetually on the way, Teigland says.
Instead, she believes, the industry will produce simpler games. “They can definitely program complex and more interesting games, but what does the market want? The market wants Candy Crush, it seems,” Teigland says.
If Teigland is right and most people want to play what she calls the gaming equivalent of Snapchat, how will the more advanced gaming technology be used? One possibility is a new genre of simulation that has grown up over the past two decades – the serious game.