If you’re under 30—or have a child in their teens—then chances are you’ve heard of a game called Fortnite.
Fortnite, a “survival shooter” video game released last year, has quickly become one of the most popular online games in the world—a towering cultural phenomenon on a par with Minecraft or World of Warcraft. Earlier this year, it hit a record of 3.4 million users playing the game simultaneously, while its overall player base has reportedly topped 45 million.
Yet Fortnite’s huge success has been dogged by controversy, reviving old hysteria about video games. Ministers are apparently concerned over reports of the game inspiring violent behavior among children. Children’s Commissioner Anne Longfield recently described Fortnite as “irresponsibly addictive” and called for harsher regulation to deal with it.
New Game, Same Old Concern
This is nothing new. Video games have long been a focus for moral panic, with questions raised about their side effects, “addictiveness,” and alleged links to real-world sexism and violence. Gaming is only the latest in a long line of cultural products, including rock and roll, television, and even the novel, to face such criticism.
Of course, with gaming, this takes on added weight. Because almost a third of gamers are under 18, there is a strong desire (as per Helen Lovejoy) to “think of the children.” But is it accurate to describe excessive gameplay as an “addiction,” or does doing so devalue the concept to the point of meaninglessness?
Proponents of the addiction thesis often point to the dopamine released during gameplay, claiming that video games, like cigarettes and drugs, trigger a rapid dopamine hit. But this argument falls apart on closer inspection. Gameplay does raise the brain’s dopamine levels, but only to about the same extent as eating a slice of pizza, roughly a tenth of the amount released by ingesting cocaine or heroin.
The World Health Organization recently classified “gaming disorder” as a mental health condition for the first time, though scholarly opinion remains divided, with many researchers viewing “problem gaming” as a symptom of existing mental health issues rather than a cause.
These debates are part of a wider trend towards pathologizing irresponsible or immoral habits—one that arguably risks stripping people of personal responsibility. It’s no coincidence that men caught up in scandals often, retrospectively, claim to be sex addicts (e.g. Tiger Woods, Charlie Sheen, Kevin Spacey). Most recently, Harvey Weinstein, who spent decades abusing women with little sign of remorse, checked into “sex-addiction rehab” when his actions were exposed.
Disease vs. Accountability
For parents who are troubled by the scale or nature of their children’s gaming habits, the idea that those children are suffering from some kind of disease may prove comforting. After all, the sick cannot be held fully accountable for their actions, and rebranding excessive gameplay as a collective health crisis arguably relieves parents of some of their own responsibility for monitoring their child’s behavior. And, of course, there are numerous groups and critics with a vested interest in the social construction of this panic.
But video game hysteria becomes far more consequential when it inspires thoughtless or draconian legislation. We are already seeing this in recent government pledges to limit the time children can spend playing online games and using social media, though it remains unclear how such limits would be enforced in practice. Imposing them would not only remove parental responsibility from the equation but also follow the lead of such pioneers of internet freedom as China. Is this really the direction we want to go in?
Granted, video games can be extremely absorbing and immersive, but so can a good book. As a child, I used to sneak downstairs in the middle of the night to play my favorite game, The Legend of Zelda: Ocarina of Time, but I have certainly been “addicted” to books in the same way: reading Harry Potter for days on end, smuggling Narnia books under the dinner table. The same thing happened with the thriller Gone Girl a few years ago and even, I am ashamed to say, Dan Brown’s ludicrous but gripping The Da Vinci Code. It is not obvious what is inherently addictive about video games above and beyond other enjoyable pursuits.
Of course, a small proportion of people do game compulsively, neglecting their studies, failing to turn up at work, or disengaging from friends and family offline. Yet the number of gamers affected in this way is comparatively small: one recent wide-ranging study of around 19,000 young adults found that between 0.3 and 1 percent experienced video game “addiction.” The same is surely true of myriad day-to-day activities; people frequently over-indulge in food, work, religion, exercise, and all sorts of other pastimes, none of which has been codified as an official disease.
For the most part, gaming is normal behavior which may well prove a waste of time to some players but is rarely life-ruining or disruptive in the way drug or alcohol abuse can be. We should be cautious before imposing new legislation that will affect 100 percent of gamers and erode internet freedom for the sake of a small minority of problem cases.
Moral panics rarely inspire thoughtful public policy, and there is a danger that legislating in haste may cause us to repent at leisure.