
Confessions of a Retired Catholic Gamer



In his seminal 1981 work Simulacra and Simulation, French post-modernist philosopher Jean Baudrillard diagnosed a phenomenon that had become a standard mark of the late industrial, electronic age.

Baudrillard used the term “hyperreality” to describe the creation of a world of illusion detached from reality, or “the generation by models of a real without origin or reality,” a simulation that, as Baudrillard would later develop, is more real than reality itself.


As Baudrillard diagnosed, Westerners, especially in the United States, spent the late 20th century escaping their boring and comfortable post-World War II lives, what he called “the desert of the real,” in order to enter the world of television and cinema.

For many people in what was rapidly becoming the “post-Christian” West, soap operas such as Days of Our Lives and General Hospital, and sports teams like the L.A. Lakers and Detroit Pistons, were more real than their own desiccated lives.

For young people scavenging their way through the ruins of the late 20th-century, post-Christian West, there is an even more entrancing form of hyperreality: video games.

Although video games had existed in electronics laboratories since the 1950s, commercial games first became widely available in the 1970s.

However, with the release of Space Invaders in 1978, a new phenomenon was born: the arcade. Many young people — especially those with a propensity for geekiness and social awkwardness — would flock to pizza parlors, bowling alleys and shopping malls to play classic games like Asteroids (1979) and Pac-Man (1980).

While the stereotypical “geek,” adorned in the “Clark Kent” glasses made popular by Christopher Reeve in the 1978 film Superman, could regularly be found tapping away at Centipede or Frogger, arcades were a social hub where young people could share a pizza and a pitcher of root beer between games of Joust, or meet up to drop a few quarters into Donkey Kong before going out for the evening.

While, without a doubt, Catholics should look askance at the phenomenon of young people of both sexes meeting together without chaperones, the arcade was a fundamentally social place where people played video games together for a limited amount of time — usually until their quarters were exhausted.

Those who, by some financial wizardry or incredible skill, were able to play video games for hours on end were seen as odd and bizarre characters and, one must admit, at times as heroes, like the (now-suspect) retro arcade champion Billy Mitchell.

However, with the advent of home console systems such as the Atari Video Computer System (1977), the Nintendo Entertainment System (1985) and the Sega Genesis (1988), something began to change in the nature of video games.

While young people still crowded around the couch in the family den, basement or living room to play video games together, video games began to lend themselves to solitary extended play. No longer restricted by a pocket emptied of quarters or “closing time” at the pizzeria, one could play games like Zelda or Double Dragon for hours on end in the quiet comfort of one’s own room.  

While “video game addiction” became a very real phenomenon for those glued to home consoles as well as to the rapidly developing personal computer, or “PC,” video games still did not have an all-engrossing effect on most people.

There are three primary reasons for this.  

First, video games of the 1980s and even well into the 1990s, while often beautifully designed, did not yet constitute hyperreality; that is, the graphics were not realistic enough to confuse the gamer into thinking that he was doing something in real life.

Secondly, video games of the second, third, fourth and even fifth generations were often very difficult (much more so than many games today) and carried the stigma of being the province of social outcasts. Sports and other real-life activities were perceived as far superior to dwelling in one’s basement grinding one’s way through Super Mario World. Thus, intense gaming was still the province of intelligent people who had trouble functioning in the real world.

Finally, video games throughout the 1980s and 1990s were very linear. They had a clear ending, and although their storylines and game design were often far superior to those of later, 21st-century games, one could only “beat” a game so many times before it became boring.

However, as America and the world rounded the turn of the 21st century, video games would radically change in content and format and swallow large swaths of the world’s population.

In 2019, nearly 50 years after their commercialization, video games are an immense cultural force, raking in over $130 billion in 2018 alone.

This immense profitability is primarily due to the fact that video games are no longer just for nerds.

In the 21st century, everyone plays video games, and they play them a lot — or at least a lot more than people used to play them.  

Much more unsettling are the deleterious effects of many online games that, owing to their open-ended, ever-expanding nature, constantly add new quests and projects for gamers, making it impossible to “beat” the game once and for all.

Perhaps the most notorious online game accused of causing a slew of deaths is World of Warcraft, or “WoW,” a massively multiplayer online game (MMO) that enables the player to enter a medieval fantasy world and perform a host of quests alongside other people connected to one another via the internet.

WoW addiction has allegedly caused a mother to starve her daughter, a young man to die of exhaustion after playing the game for 19 hours straight and a series of suicides; it has even inspired the phenomenon of “Warcraft widows,” women who have, in effect, been abandoned by husbands who spend far more time playing the game than with their spouses.

World of Warcraft is not the only game that, owing to its addictive nature, has resulted in fatalities. 

Even the seemingly innocuous game FarmVille became so important to a New Mexico woman that she shook her three-year-old son to death when he interrupted her gaming.

These situations are obviously extreme manifestations of 21st-century gaming addictions, and there are very likely other factors in these tragedies, including, most likely, mental health issues.  

However, these extreme outlying examples are not the real problem with contemporary games.

Nor is the immodest (or even, in some cases, explicitly pornographic), savagely violent and even explicitly satanic nature of many video games the primary problem with the phenomenon.

Rather, it is normal, everyday gaming that poses at least a potential moral quandary for Catholics.  

Before entering into our moral critique of video games, your humble author will lay all his cards on the table.

The author of this piece is himself a “retired gamer” who immensely enjoyed his gaming experience during the Golden Age of console gaming in the 1980s and 1990s. He was a gamer when being a gamer was considered a social anathema and has probably logged more hours gaming in his life than the most diehard Fortnite fanatic or Skyrim devotee.

As Michael Jackson, that deeply and terribly morally flawed, tragic victim of the music industry, says in his 1987 hit song “Bad,” your author could very well tell any young person who claims to be a gamer, “You ain’t bad. You ain’t nothing.”

Secondly, your author is no moral authority, certified theologian or anyone’s spiritual director, and is not telling the readers of Church Militant whether they can or cannot play video games. He is just providing a layman’s opinion on the potential dangers of gaming.  

In doing so, however, let us turn to the Universal Doctor of the Roman Catholic Church, St. Thomas Aquinas.  

First of all, we know from Aquinas that recreation and amusement are licit and virtuous for a Christian.  

In Question 168 of the Second Part of the Second Part of the Summa Theologiae, Aquinas teaches that leisure is necessary for the soul, and that “[j]ust as man needs bodily rest for the body’s refreshment, because he cannot always be at work, since his power is finite and equal to a certain fixed amount of labor, so too is it with his soul, whose power is also finite and equal to a fixed amount of work.”

Humans need not only to rest the body through sleep but also, the Angelic Doctor argues, to rest the soul through some form of leisure or “playing.”

In fact, quoting Aristotle, St. Thomas states that there is a virtue associated with gaming called “eutrapelia,” or “pleasantness.”

However, St. Thomas Aquinas warns that games can be, in fact, injurious to the soul if certain requirements are not fulfilled.  

First of all, drawing from the writings of the Roman orator Cicero, Aquinas argues that playing should “not be sought in indecent or injurious deeds or words.”

This caution should prompt Catholics to ask themselves some serious questions.  

Is pretending to rob, murder and sexually assault other humans in games like the Grand Theft Auto series a decent activity for Catholics?

Is it not possibly injurious for a Christian to pretend to perform magic spells in RPGs like the Final Fantasy series?

Secondly, Aquinas argues that when gaming, “we must be careful … to conform ourselves to persons, time and place, and take due account of other circumstances … .”

Is it fitting for a grandmother to spend hours puzzling her way through Candy Crush every day?  

Is it OK for a Catholic dad to ignore his wife and kids and pretend to be LeBron James in NBA 2K19 for hours on end?

Should a Catholic priest be tapping away at a Nintendo Switch while his flock wanders into sin and perdition?  

Aquinas further notes that, as a virtue, eutrapelia requires that fun and games be “moderate.”


In the case of video games, how many minutes or hours should one play per day?  

Should a high school or college student play 4–6 hours of Call of Duty every day, or even most days, when he or she should be studying, exercising or preparing for courtship?

As numerous scientific studies on the matter suggest, video games are by their very nature addictive, providing the illusion of success and a sense of accomplishment when the gamer has, in fact, just spent several hours doing absolutely nothing.

One might wonder if it is possible, at all, to play video games in moderation.  

What we can gain from this Thomistic analysis is that gaming and recreation are potentially virtuous if done in moderation and in a way that is appropriate and fitting to one’s station and age.

The final question, however, is whether such virtuous gaming can be done in the hyperreal world of video games, or whether it is much better to engage in spiritual and political combat in the desert of the real, or what is better called this “valley of tears.”

 
