To say that the release of Cyberpunk 2077 was a disaster would be an understatement. From the moment the game launched on December 10th of last year, it was abundantly clear that it was so buggy as to be nearly unplayable. Some glitches were merely cosmetic, like NPCs randomly T-posing or trying to put items away in non-existent pockets. Others caused real gameplay problems, like AI drivers so bad that cars randomly went flying, or players getting inexplicably trapped inside vending machines. For a couple of weeks after the game’s release, one of my friends complained that their character kept falling through the floor during important moments. And of course, the game crashed constantly, regardless of what system it was played on.
Within days, so many refunds were demanded that Microsoft and Sony both changed their refund policies to let anyone get their money back for the game, regardless of how many hours they’d already played it. But Cyberpunk 2077 was more than just a waste of money. It was a letdown to anyone who bought into the hype that the developer, CD Projekt Red of The Witcher series, had spent years manufacturing. This was especially true for fans of the cyberpunk genre and of the titular tabletop series the game is based on, both of which, many complained, it failed to live up to.
Since the game was first announced in 2012, it was built up as a hard-hitting, immersive, open-world sci-fi RPG that would supposedly be unlike any game ever made. The result was a buggy, fairly mediocre game with an equally mediocre story. Cyberpunk as a genre is about more than high-tech dystopian futures where you can replace your body parts with machines, and while Cyberpunk 2077 may have fit the genre aesthetically, it failed to really address any of cyberpunk’s major themes: wealth inequality, a changing social order, the inevitable degradation of capitalism, and so on. This was perhaps a foregone conclusion, since no game with a budget of 313 million USD was ever going to come out and loudly declare that rich people suck, but it was disappointing for fans nonetheless.
On top of being a letdown story-wise, the game also failed on an ethical level. Despite multiple promises from executives that developers wouldn’t be required to work overtime, Cyberpunk 2077’s employees still fell victim to crunch. “Crunch” refers to a period of weeks or months, sometimes years, in which developers are forced to work more than 40 hours a week, usually toward the end of a game’s development cycle. In the months leading up to the game’s release, the studio was reportedly making workers clock in six days a week. During production of the company’s previous game, The Witcher 3: Wild Hunt, working conditions were by many accounts even worse, with management remaining inflexible and unsympathetic about similar schedules for months longer. In an ironic twist, CD Projekt Red has come to resemble the evil corporations portrayed in its own game.
Unfortunately, crunch is far from unique to CD Projekt Red.
BioWare, a video game developer known for series like Mass Effect and Dragon Age, is a prime example of what crunch has done to the video game industry. To summarize about a decade of video game history: the studio suffered from poor management and almost always relied on crunch to get their games out on time. This worked really well, until it didn’t. Their last two games, Mass Effect: Andromeda and Anthem, both received mixed reviews from critics.
For many people in the video game industry, games are not just a job but a passion. Studios are more than willing to take advantage of that passion to make employees work under inhumane conditions, but it’s far from necessary. As companies like BioWare are beginning to show, crunch can’t be used as a substitute for good management. It is more than possible to produce games without overworking staff, just as it’s possible to produce any other product ethically. There’s no reason to keep rewarding games for mistreating the people who worked on them.
This brings us to The Last of Us Part II, winner of, among many other accolades, The Game Awards’ Best Game Direction. The decision was criticized by multiple publications, including Forbes and Kotaku, because, as you may have guessed, The Last of Us Part II involved a lot of crunch. Crunch is not a disqualifying factor for most awards, but it should have been for Best Game Direction. Even outside of crunch periods, Naughty Dog, the studio behind the Uncharted and The Last of Us series, is known for asking employees to put in unreasonable hours. Twelve-hour workdays and extra weekend shifts are fairly normal.
The studio also doesn’t have a production department to keep things organized and on schedule. Instead, Naughty Dog’s philosophy is that everyone should be their own producer, which is a nice idea for reducing bureaucracy but doesn’t help with delivering products on time. Rather than hiring people to manage their projects properly, they just make everyone stay late. And yet, this game won an award meant specifically to recognize notable leadership.
Naughty Dog’s lack of compassion for its employees is not a secret. But when the 30-odd video game news organizations selected to decide The Game Awards cast their votes, they decided that the game with the best leadership and management was The Last of Us Part II. The message here is clear: it doesn’t matter what you do to your employees, so long as the product is good.
This becomes even more pointed when you look at some of the other games that were up for Best Game Direction, like Hades, developed by Supergiant Games. Supergiant, known for the critically acclaimed games Bastion, Transistor, and Pyre, made headlines last year for detailing the anti-crunch policies they used during the production of Hades. Among other things, the studio gives employees unlimited time off with a mandatory minimum of 20 days each year, and does not allow work emails past 5 p.m. on Fridays.
Hades is a rogue-like action RPG in which you play as Zagreus, son of Hades, and try to fight your way out of the Underworld. A rogue-like is a subgenre of video games in which the player must traverse a series of procedurally generated levels and must restart upon death. Think The Binding of Isaac or Pokémon Mystery Dungeon. Hades makes a great addition to the genre. On each playthrough, or “run,” the player has one magical attack and a weapon with two attacks; the weapon can be switched out between runs. By presenting the player with different choices of “boons” from various Olympians, the game manages to make each run feel unique. During one run, the player might put together a defensive build using the shield and a boon from Athena that deflects enemy attacks; during another, they might pick up a Zeus boon that strikes their enemies dead with lightning while they simply run around the map. The game strikes just the right balance of random generation to make each run different without taking it entirely out of the player’s control.
Supergiant specifically decided to make a rogue-like because they wanted to experiment with procedural narrative storytelling. To move the story forward, players have to not only progress further into the Underworld but also complete a certain number of runs. I use the word “complete” loosely here. Hades is a game that expects the player to lose, and to lose frequently. In fact, each time the player dies, they get a death screen that declares, “THERE IS NO ESCAPE.” But each death also lets you spend in-game currency, deepen your relationships with NPCs, and move the story forward. Somehow, Supergiant Games even managed to make losing fun. There’s also an easy mode, dubbed “God Mode,” that can be turned on at any point for players who struggle with the gameplay but still want to see the story through.
Hades has not suffered for its lack of crunch. If anything, it has thrived. Since its early-access release in 2018, its gameplay, graphics, and story have been praised by critics. The studio specifically chose an early-access model to get more player feedback during development, a decision that had a significant impact on the final gameplay. When the game officially came out in September 2020, it was met with universal acclaim.
The game’s credits, not counting cast, list about four dozen people, fewer than half of whom are employed full-time by Supergiant Games. Compare that to the 500 people who were working on Cyberpunk 2077 at the time of its launch. Hades made it onto nearly every Game of the Year list for 2020, while Cyberpunk 2077 ended up as barely more than a meme. Clearly, the number of people putting in overtime isn’t what makes a game worth playing.
Although the broader software development field certainly has its own issues with work-life balance and expecting too much of employees, the video game industry treats crunch like a crutch. Studios like BioWare and Naughty Dog throw the basic principles of software planning and management out the window because they know that if worse comes to worst, they can just ask everyone to stay late for the next six months. But Hades proves that none of that is necessary. With proper planning and player feedback, it doesn’t take the suffering of workers to make a beloved game.
I know that I’m drawing a bit of a false equivalence between a game like Hades, a 2D hack-and-slash, and a game like Cyberpunk 2077, a 3D open world with supposedly endless customization options. But I’ve only had Hades for a month and a half, and I’ve already logged more hours in it than my three friends who played Cyberpunk 2077 have in that game combined. Studios need to stop prioritizing how big or complex a game will be over how playable it will be. More than that, they should stop prioritizing how much money a game could potentially make over how much their employees have to suffer.
On every level, people should be doing more to oppose the exploitation of workers in the video game industry. There needs to be a fundamental culture shift: crunch should not be considered the norm. The news outlets that make up The Game Awards’ selection committee should not be praising the direction of games that don’t respect their employees over ones that do. Workers should know their worth and understand that, no matter how passionate they may be about a project, they should not allow themselves to be forced or coerced into unreasonable working conditions. There also need to be more large-scale unionization efforts among video game workers, so they have the power to demand the benefits they deserve.
As for the players, I strongly believe that the burden of ethical production should not fall solely on the consumer. After all, there is no purely ethical consumption under capitalism, and treating consumerism as activism can be actively harmful to many causes.