“All right, this is going to be amazing,” I told myself minutes before midnight on the eve of Halo: The Master Chief Collection’s release. Fueled by Red Bull, Doritos and hype, I was ready to plunge into nostalgic goodness. I loaded up multiplayer the moment the game started, then waited for a match, and waited, and waited. “It must just be server overload. Let me play some campaign and come back,” I consoled myself. That would become a common refrain over the next couple of days.
Now, a week later, the online multiplayer component of Halo: The Master Chief Collection is still nonfunctional. On top of the atrocious wait times and matchmaking issues, when you do get into a game, it rarely plays out like it should. A 4-on-4 Team Slayer match randomly splits everyone onto separate teams, friends in a party get placed on opposite sides and the player in first place is labeled “1th.” By every measure, Halo: The Master Chief Collection is an utter disaster. The launch has quickly turned from a time of celebration into a time of severe annoyance.
The most agonizing thing about this whole situation is that it isn’t unique to Halo: The Master Chief Collection. In October, PlayStation 4’s flagship driving game Driveclub was littered with network issues, and even a month later, it still doesn’t work properly. Call of Duty: Advanced Warfare, the latest entry in the largest gaming franchise in history, launched without the dedicated servers its developer had promised beforehand. The resulting peer-to-peer network structure made for frustrating gameplay because of the stifling lag. The latest offender is Assassin’s Creed: Unity, a game riddled with bugs and optimization issues.
These problems raise the question: What the hell is happening with these developers? It is 2014, the next-gen consoles are a full year into their life cycles, and there are more problems than ever before. How is it that full-fledged developers with stacks of cash can’t fully optimize and test their games?
There are a couple of scenarios I could see as the culprits here. One possibility is that there simply wasn’t enough time to test for bugs or refine the game’s systems. Developers have to meet a release date, and more often than not, they are rushed to put out a finished product. More time needs to be allocated to quality assurance to make sure the games being shipped aren’t broken. Some developers are better about this than others, taking hefty delays to ensure their games run smoothly, but in some cases, such as Driveclub, which was delayed an entire year, even that doesn’t cut the mustard.
Another, far more cynical possibility is that these developers are completely aware of the state of their games but that the corporate office demands a shipped product regardless. The developers may say the games can be patched afterward, and that’s all the publishers need to hear. As long as the games get shipped, the industry’s heavy emphasis on preorders means people will buy them impulsively before knowing how they actually play. I would like to think that publishers care about the overall reception of their games, but with huge fan followings behind franchises like Call of Duty and Assassin’s Creed, they know people will buy them anyway.
My biggest gripe is that, time after time, a massively hyped game releases and the servers buckle under the load. It has been more than two years since the Diablo 3 “Error 37” debacle, when no one could play on launch day. How is it still possible that developers continue to bungle the expected server load at launch? I’m no network engineer, but the solution seems obvious: overshoot your estimate of launch-day players, then scale back once you see how much server capacity you actually need.
Another obvious solution is to run a beta test to gauge server strength. While this doesn’t guarantee a perfect launch, it goes a long way toward limiting whatever problems do arise. Destiny had one of the smoothest launches for a game of its size, and that can largely be attributed to its wildly successful beta. One of the best things a developer can do is simply delay the game. It’s a painful solution for the company in the short term, but as Shigeru Miyamoto said, “A delayed game is eventually good; a bad game is bad forever.”
In the modern world of gaming, it is becoming harder and harder to forgive developers for releasing broken products. New consoles launched just within the last year, yet developers seem to be regressing in the polish and care they put into their games. Ten years ago, I played Halo 2 online on launch day with no problems; now, with massively better technology, I can barely connect to a game in Halo: The Master Chief Collection. This is entirely unacceptable, and consumers are the ones paying the price. I now ask, I beg, I plead of you developers: fix your damn games!
1 Comment
It took Creative Assembly over a year to get Total War: Rome II adequately optimized. Not even free downloadable content throughout that process was enough to tide over a lot of fans. I refuse to buy games day one anymore because I expect them to be unfinished and buggy as hell.