Your game needs automated testing
A fascinating piece of news cropped up on Friday, hitting major games and tech publications before spreading across social media. (If you're already familiar with the story, feel free to skip ahead to the lesson.) For context, 'Aliens: Colonial Marines' was subject to controversy and delays throughout its development. When Gearbox finished it and Sega put it on shelves in February 2013, critics widely panned the game, pointing out issues with storytelling, ill-fitting mechanics, and this, as PC Gamer put it:
Oof. If you make an Aliens game, you'd better get the aliens right.
Since then, a die-hard group of modders has been trying to rescue the game. Friday's news was the discovery of an apparent single-character typo in a configuration file that may be almost entirely responsible for the poor AI behavior.
Here's jamesdickinson963 on ModDB explaining the gaffe:
This config entry attempts to set the AI behavior, but code for AttachPawnToTeather doesn't exist; code for AttachPawnToTether does. Now you can make all kinds of hay about this, and plenty already have, but my intention is not to pick on the fine folks at Gearbox. I'm here to make a larger point about games development practice, and how we games engineers can do better. A mistake is a mistake, but refusing to learn from one is the real failure. Without further ado...
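To see why a typo like this fails silently instead of crashing, consider how engines typically wire config strings to code: the string is used to look up a class in a registry, and a missing entry often just yields nothing. Here's a minimal sketch of that failure mode; the class and function names are illustrative, not taken from the actual game.

```python
# A minimal sketch (hypothetical names) of how a string-keyed class
# lookup can fail silently when a config file contains a typo.

class AttachPawnToTether:
    """The behavior that actually exists in code."""
    def run(self):
        print("xeno attached to tether")

# Registry mapping config strings to behavior classes.
BEHAVIORS = {"AttachPawnToTether": AttachPawnToTether}

def spawn_behavior(name):
    cls = BEHAVIORS.get(name)   # a typo yields None; no error is raised
    if cls is None:
        return None             # the behavior is silently dropped
    return cls()

# The config file asks for a class that was never defined:
behavior = spawn_behavior("AttachPawnToTeather")  # note the typo
print(behavior)  # None -- the AI simply never gets this behavior
```

Nothing crashes, nothing logs; the game just runs without the behavior, which is exactly why the bug survived to launch.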
You need automated testing in your game.
I almost feel silly saying this, because the value of automated testing is already accepted in other software domains. (Not everyone does it, but basically everyone knows they should.) Yet most game developers I talk to can't agree that there's any point to it. Or: "sure, we could do that, I guess, but it would be a lot of work for little payoff, and I've got thirty other high-priority items to tackle before Big Scary Deadline in two weeks."
I would suggest that launching with not-completely-broken AI is high priority.
Because here's the thing: an automated test to check the validity of a configuration file should be entirely straightforward to write. Validating syntax and checking for the existence of class names and other proper nouns is what every compiler and linter does. It's a well-understood problem with proven solutions. And when you look at the potential payoff here—saving entire swaths of your game from being disabled by a typo—the cost-benefit analysis is ridiculously in favor of testing. (This isn't even a super-rare problem; Civilization VI suffered from a similar config-file typo.) This is the enlightened practice: find things that are reasonable to test, where an error would be costly or distracting or frequent, and write a test for it. It will never break silently again.
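Such a test really is a few dozen lines. Here's a sketch of one, assuming a simple "key=ClassName" config format and a known set of behavior class names; everything here is illustrative, not the game's actual format.

```python
# A minimal config-validation test, assuming a "key=ClassName" format.
# The class names are hypothetical examples, not from the actual game.

KNOWN_CLASSES = {"AttachPawnToTether", "MoveToCover", "ChargePlayer"}

def validate_config(text):
    """Return a list of error messages; an empty list means the config is valid."""
    errors = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if "=" not in line:
            errors.append(f"line {lineno}: expected key=value, got {line!r}")
            continue
        _, value = line.split("=", 1)
        if value not in KNOWN_CLASSES:
            errors.append(f"line {lineno}: unknown class {value!r}")
    return errors

# The typo is caught the moment the test suite runs:
print(validate_config("xeno_behavior=AttachPawnToTeather"))
# the returned list names the unknown class on line 1
```

Run this over every shipped config in CI and a one-character typo becomes a red build instead of a launch-day disaster.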
But keep in mind that tests can't grow without the right environment. In order to be effective, in order to see where best to apply them in the first place, automated testing has to be part of your culture. And you should start practicing with these tools before something catches on fire.
Would preventing this error have salvaged the game's critical and commercial prospects? Unfortunately, we'll never know. AI wasn't the only problem, but I can imagine players brushing aside some of the other faults if only the gameplay had been better. Sometimes you just want to frag some xenos, after all.
Did Gearbox developers have any reason to foresee the risk they were taking? No, probably not. I can imagine a developer saying: "surely we'll notice if this config is borked." But of course we know they didn't. Maybe this error crept in at just the last minute, when developers and QA were in the launch frenzy. Or maybe they did notice, but spent hundreds of work-hours failing to track down the problem before time ran out, never thinking to check the humble config file. The errors that automated tests save us from tend to be the errors we'd never expect. That is precisely why they are so valuable.
We humans are not as perfect as we think we are, especially when the pressure is on. Writing tests is perhaps the most humane thing we engineers get to do.
We can be better engineers, little by little, every day. Stay on the lookout for opportunities to learn new tools and leverage them to deliver quality software. If you don't, it just might be—to quote the late, great Bill Paxton—"game over, man, game over."