It would not be an overstatement to say that Nolan Bushnell, co-founder of Atari, helped set in motion events which have completely reshaped our culture, how we relate to it and to each other. In fact, when history looks back at the last half of the 20th century with its unforgiving gaze, that may even seem to be a gross understatement, since beyond establishing an entire industry and -- depending on how you look at it -- an entirely new art form, the achievements of Bushnell and his associates have in turn inspired other visionaries to transform how we work and, perhaps even more importantly, how we play.
Bushnell himself would probably be the first to acknowledge that subsequent generations of designers and programmers have expanded and improved upon his vision in ways he couldn’t have possibly imagined. He often credits his success to serendipity, to being in the right place at the right time, and also to standing on the shoulders of giants. Bushnell certainly didn’t invent computers, or even video games, but he saw a potential in them -- a place for them in our everyday lives that proved as revolutionary as the devices themselves.
The building blocks of what would become Atari’s Pong (video games’ first “killer app”) were, in technological circles at least, well known within a few years of Bushnell’s graduation from the University of Utah’s College of Engineering in 1968. The concept of computer gaming was introduced at Cambridge in 1952, when a PhD candidate named A.S. Douglas programmed OXO, a rudimentary tic-tac-toe game. The first tennis game, Tennis for Two, was developed in 1958 by Manhattan Project alum William A. Higinbotham, ironically after he observed that visitors to Brookhaven National Laboratory, the U.S. nuclear research facility built on the site of the Camp Upton army base on Long Island, were bored by the machinery charged with harnessing the very power of the atom. The first coin-operated video gaming machine, Galaxy Game, was installed in the Stanford University student union in 1971.
But these were merely techie diversions and novel demonstrations of computers’ varied applications; what Bushnell did was synthesize these ideas into a global phenomenon. Of course, as with anything, there were missteps. Along with his business partner Ted Dabney, Bushnell created Computer Space in 1971, the first commercially sold video game of any kind, but the arcade-style machine found little success beyond being featured as a futuristic signifier in the 1973 sci-fi thriller Soylent Green. Undeterred by the game’s failure, Bushnell and Dabney reinvested their minuscule profits from Computer Space, a paltry $250 each, into their stubborn dream of making video games as ubiquitous as pinball machines or pool tables. They started their own company, Syzygy, named after an astronomical term for the earth, moon and sun in perfect alignment. Thankfully, the difficult-to-pronounce name was already registered in California by, of all things, a hippie candle-making collective, and Bushnell was forced to change it quickly, landing on Atari -- a word used in his favorite board game, go.
Bushnell and company initially planned to create a driving game for Atari’s first release, but, keeping things simple, they settled on updating existing tennis games with a few simple improvements, including having the ball gradually increase in speed. Thus Pong was born. Amazingly, their coin-op creation stirred little interest among entertainment and toy companies, including the pinball giant Bally, which balked at the two-player nature of the game. But it was precisely this competitive setup, along with the endlessly challenging nature of the gameplay, that led the first jury-rigged machine to be installed in Andy Capp’s Tavern in Sunnyvale, California, and -- shortly thereafter -- to become such a mind-boggling success. Tellingly, the first service call Atari received was not to fix the machine itself, but to come and empty the thousands of quarters, which had filled the allotted space and started backing up into the coin slot, causing the game to malfunction.
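Pong itself was built from dedicated TTL hardware rather than software, but the speed-up rule the team added can be sketched, purely as an illustration (the function name, scaling factor and speed cap below are hypothetical, not Atari's actual values):

```python
# Illustrative sketch of Pong's difficulty curve: each paddle hit
# reverses the ball's horizontal direction and nudges its speed
# upward, so rallies get progressively harder.

def hit_paddle(vx, vy, speedup=1.1, vmax=8.0):
    """Reverse horizontal direction and scale up the ball's velocity."""
    vx = -vx * speedup
    vy = vy * speedup
    # Cap the components so the game stays playable.
    vx = max(-vmax, min(vmax, vx))
    vy = max(-vmax, min(vmax, vy))
    return vx, vy

vx, vy = 2.0, 1.0
for _ in range(3):                 # three consecutive returns
    vx, vy = hit_paddle(vx, vy)
print(abs(vx) > 2.0, abs(vy) > 1.0)  # the ball is now faster than it started
```

The cap matters as much as the multiplier: without it, a long rally would quickly make the ball unreturnable.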
With the business potential of arcade games definitively proved, sales of Pong took off, establishments dedicated to these types of games began to spring up, and arcades became meccas for a new breed of geek. Bushnell, who had experience working in amusement parks, got in on the action by starting the Chuck E. Cheese’s restaurant chain, and though he explains the decision in terms of giving kids a place to just be kids, one could view the venture more cynically: as long as kids were plugging coin after coin into these games, he might as well own the building and sell them bad pizza while they were at it.
The next market Atari attempted to conquer, the home console, would, by its very nature, spell doom for the arcade, but its prescience also helped kick-start the personal computer craze that would soon revolutionize communications and culture. As with the Pong arcade machines, the Atari 2600, first released in 1977, was not the first home gaming console, a distinction which belongs to the Magnavox Odyssey, but it did take the concept to the next level. In contrast to the Odyssey, which came with all of its games built in, the 2600 contained its own microprocessor, which ran the program code stored on interchangeable game cartridges. This meant not only that games could grow more varied and sophisticated, but that the number of titles that could be created and then sold individually was effectively limitless, providing variety for users and a continuing revenue stream for the company.
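The division of labor that made this possible can be toy-modeled in a few lines (an illustration of the idea only, not the 2600's actual internals, where the cartridge was simply ROM read directly by the CPU):

```python
# Toy model of the cartridge concept: the console supplies the
# processor and the slot; each interchangeable cartridge supplies
# only the game's program.

class Cartridge:
    """A game is just code the console can execute."""
    def __init__(self, title, program):
        self.title = title
        self.program = program      # a callable standing in for game logic

class Console:
    def __init__(self):
        self.slot = None
    def insert(self, cartridge):
        self.slot = cartridge       # swapping games never changes the console
    def power_on(self):
        if self.slot is None:
            return "no cartridge"
        return self.slot.program()

console = Console()
console.insert(Cartridge("Combat", lambda: "tank battle running"))
print(console.power_on())
```

The console is fixed hardware; every new game is just new code in the slot, which is exactly what decoupled Atari's revenue from its hardware.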
For many young people of the time, the Atari 2600 provided their first experience with computers, inspiring curious minds to take it apart, figure out how it worked, and, crucially, work out how to make it better. At the same time that the product itself was making waves, Bushnell’s desire to nurture talent was also shaking things up. Steve Wozniak and the late Steve Jobs, both of whom had worked on the Atari arcade hit Breakout, created and launched the pioneering Apple II home computer, with assistance and parts provided by a few full-time Atari employees. Atari itself followed suit with its 8-Bit Family of PCs, but it was the fledgling Apple that hit on a technological and commercial breakthrough: a useful, relatively user-friendly device that paved the way for a communications renaissance, put computers in most homes and helped spawn a myriad of related innovations, chief among them the internet.
Yet even as Atari’s technical leaps helped computing flourish into the seemingly indispensable part of our lives it has now become, the company’s cultural impact has been equally vast. The kids who grew up lurking in arcades or disassembling their consoles, particularly in the 1980s, went on to create not just more powerful hardware, but also software (i.e., games) of ever increasing complexity and diversity. Today, around two-thirds of American households have at least one gamer, and contrary to popular belief, they’re not just kids; two-thirds of those fans are above the age of 18. That is a giant segment of the population, with varying tastes and interests, and the variety of game types has multiplied in order to satisfy them, from simple chess programs to retro side-scrollers to massively multiplayer online games (MMOs) in which thousands of people cooperate as well as do battle.
The staggering number of different games, and the sophistication of their execution, has led to the hotly debated question of whether video games should be considered art. It is worth noting that many theorists and academics only became interested in video games once higher processing speeds and memory brought better graphics and, with them, the trappings of film, such as cinematic landscapes and complicated story lines. That they never thought to examine chess or Bushnell’s beloved go in a similar way suggests their interest lies mainly in surfaces, not in what already made playing games an intellectually challenging and emotionally thrilling experience. Accepted forms of art, from cinema to literature, are presented to the audience as works: things with a beginning, middle and end. Games, of any type, don’t have audiences; they have players, who, in accordance with certain rules and parameters, must act as well as think and consider. There are a few “games”, such as Keita Takahashi’s Noby Noby Boy (which I would call art simply because I don’t know what the hell else to call it), that lack any real objective whatsoever, but by and large games are concerned with winning, whether by besting an opponent or completing some task. As Roger Ebert controversially asked, “Why are gamers so intensely concerned, anyway, that games be defined as art? Bobby Fischer, Michael Jordan or Dick Butkus never said they thought their games were an art form” -- an admittedly prickly way of asking why, when games in their own right are such an ancient, wonderful and complex part of the human experience, they need to be art at all.
But were it not for Nolan Bushnell, video games would likely never have become the subject of such dizzying discourse, and it seems unlikely that the laptop I’m writing these words on, or the browser you’re viewing them in, would look at all familiar were it not for the cutting-edge spirit of the company he worked so hard to get off the ground. Bushnell and Atari, as well as their forefathers and the people who built on their legacy, deserve our respect and admiration, and thankfully it’s easy to show it. They did all the heavy lifting; all you have to do is press start.