It seems much more common for game developers, as opposed to non-game developers, not to have automated unit testing, measure code coverage, adopt agile processes, use proper source code management systems, have proper backups (!), do code reviews, track bugs or do continuous integration.
Obviously, there are going to be game developers who do follow these engineering practices, but a reputation is a form of stereotype.
So I'm wondering how much truth there is in this or if it's a total myth.
Are game developers really less skilled in the big picture of software development? Are they really 15 years behind the rest of the industry on common best practices? Do their knuckles really drag on the ground? And if so, why?
I'm no expert in game development but I've always been very interested in it. My observation of a number of game projects, and my contact with several game developers, suggests a number of factors that could explain why game developers do not follow many of these practices.
Game Projects Are Not Like Other Software?
It seems true that game projects are usually dominated by a media production workflow. Like CG movies, modern AAA megabudget console titles have teams of artists who churn out gigabytes of high-quality graphics and other media. The game project's process and its project management are totally dominated by the needs of this media production. Game project managers are quite often called "producers", the companies that make games often call themselves "studios" and, like movies and music, there are "publishers" who handle distribution and marketing. Funding models for many game studios likewise seem to follow the movie industry.
Interestingly, indie developers react strongly against this model of game development and argue that better games are the true goal, not bigger budgets.
By contrast, a multi-million dollar non-game software project will have a team where visual design work, including "UX" (user experience) design, often takes up less than 10% of total resources, while software developers and possibly business analysts (the equivalent of a game designer) represent the majority. Obviously there is huge variation here, but the contrast is clear.
The basic idea behind this factor is that the division of labour on the project tips the scales towards a different set of best practices: ones that are, arguably, better for making the cost-benefit tradeoffs and managing the risks of those projects.
Game Developers Don't Know Better?
I've been told that game developers mostly started in their bedrooms and that their skills have not expanded very far due to lack of exposure. Nobody convinced them to use unit testing, so they don't do it. Game development tends to be a passion. I know lots of skilled developers outside of game development who started out coding all night in their bedrooms, but just as they no longer use BASIC or name their variables with expletives, they no longer code games. I'm describing myself here too; I started programming like this.
Another aspect of this is that a lot of the self-professed game developers participating in online discussions about this stuff literally are kids in their bedrooms!
This factor is not very convincing on its own, but the game development sector does seem to be separated from other software development, and the rule of passion is enforced in game development companies in both directions: game developers feel strongly that they would never want to work on other projects, and game development companies have a strong policy of only hiring people who have a passion for games. If this were true of insurance companies, they'd never find candidates; people don't get passionate about insurance. Part of the truth of this factor may be that the segregation of game developers is so strongly self-policing.
With some exceptions, there are still not many game development courses run by universities, whereas the material in general software engineering and computer science courses is often specifically tuned to non-game industry problems. Think of the number of examples in programming tutorials about e-commerce or payroll!
This lack of tertiary education might not be such a big deal, except that the game developers I've spoken to think it is. They believe that game development is a speciality that needs special courses. Managing the media pipeline mentioned above is often cited as part of this, as are fast-moving hardware advances such as GPU architectures and the iPhone-style app-store publishing side of the game industry. Business IT courses are only gradually starting to touch on this.
Games Don't Ship Upgrades?
Traditionally, games are shipped in their final state.
Back in the olden days people bought physical media in a physical retail outlet. I'm only partially joking here; clearly that method is quickly disappearing, and luminaries like John Carmack have predicted that next-generation consoles will cater predominantly to online distribution. Games that require dozens of gigs of media have seemingly had no alternative but to rely on ever-increasing optical media densities to ship their bits.
While most games these days seem to have upgrades that are made available online, these are universally free, contain only bug fixes and are maintained by minimal staff. Games do not generally go through the same lifecycle as, say, Microsoft Office, where a new version of the same software is released on a regular basis and customers usually pay to upgrade.
The effect of this is that many of the practices in non-game software development that pay off only in the long term don't seem to be worth the trouble in the short term.
While this factor may have been true in the past, it is becoming increasingly less convincing across the board. With the rapid growth of gaming segments such as MMOGs, casual games and mobile games on always-online devices like iOS and Android phones and tablets, I expect a greater proportion of games to get their revenue from long-running projects that include multiple upgrades or in-game purchases that fund continued development of the software.
When the software is under continual development, as is most pronounced in SaaS, the payoff for the investment in quality engineering practices becomes more obvious. Games developed without such practices will, like non-game projects, often become bogged down in technical debt or suffer disasters that severely impact the businesses behind them.
By contrast, those game development studios that adopt automated unit testing and invest in infrastructure like modern version control systems (e.g. Git, Mercurial) and Continuous Integration (CI), not to mention bug tracking and things like code review, gain a potential competitive advantage that can lower the cost of development over time.
Games Can't Be Unit Tested?
I've heard several game developers state that game development is different in a technical sense, such that unit testing (and other related engineering practices) are not suited to game systems. When pressed, these people will concede that certain parts of game development are like any other software in their suitability for unit testing, but they argue that there is a fundamental mismatch.
To be frank, I find this argument just as weak as it is in non-game development projects. I'm not claiming that everything can be unit tested, because that's not even the question. The question is whether the effort spent on unit testing is outweighed by the benefits derived. The trouble is that the vast majority of "things that can't be unit tested" that I have been shown over the years are really just design problems in the code that only become apparent when a unit test is attempted. Making a small change to the production code fixes the problem. For those developers (both inside and outside game development) who have never gained the benefit of a good regression test suite, the idea of changing the production code to make it easier to test sounds like pure madness.
Of course it's not pure madness, it's just the scientific method.
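To make that concrete, here is a minimal, purely illustrative sketch (in Python; names like apply_gravity are my own invention, not from any real engine) of the kind of small design change I mean: an update step that reads the wall clock and draws to the screen in the same function is painful to test, but passing the timestep in and keeping the logic pure makes a regression test trivial.

```python
# Illustrative sketch only. The "hard to test" version reads real elapsed
# time and needs a real renderer, so a unit test can't pin its behaviour down:
#
#   def update_and_draw(player, renderer):
#       dt = time.time() - player.last_tick
#       player.vy += GRAVITY * dt
#       player.y += player.vy * dt
#       renderer.draw(player)
#
# A small change to the production code: pass the timestep in and keep the
# physics pure. Rendering stays elsewhere; the update step becomes testable.

import unittest

GRAVITY = -9.8  # units per second squared


def apply_gravity(y, vy, dt):
    """Pure update step: returns the new vertical position and velocity."""
    vy += GRAVITY * dt
    y += vy * dt
    return y, vy


class GravityTest(unittest.TestCase):
    def test_falling_player_accelerates_downwards(self):
        y, vy = apply_gravity(y=100.0, vy=0.0, dt=0.1)
        self.assertLess(vy, 0.0)
        self.assertLess(y, 100.0)

    def test_zero_timestep_changes_nothing(self):
        self.assertEqual(apply_gravity(50.0, -3.0, dt=0.0), (50.0, -3.0))


if __name__ == "__main__":
    unittest.main()
```

Nothing exotic was needed; the attempted test simply exposed a seam the original design was missing, and the same idea applies whatever the language or engine.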
Game Projects Are Too Hardcore For That Crap
I've heard this one from game developers who, let's just say, do their bit to perpetuate the stereotype of the macho code-all-night, pizza-eating, caffeine-guzzling, engine-optimising dude. All-nighters are not generally seen as sustainable by most enlightened teams I've worked with, and whenever it happens it's seen as fail. Fail happens. Heroic efforts are sometimes necessary. And if you feel that the resulting burnout is a badge of honour, then it can be kind of fun, like the pride an extreme sports junkie takes in their broken limbs and sick x-rays. We all know this is true because most of us have been there.
Like a "breakdancing battle" I remember many schoolyard bragging sessions about how many sprites I could animate on screen at once. My friends challenged me to make more impressive graphics than them and I relished the challenge. I believed in my ultimate mental power over the machine. I felt invincible and omnipotent in my coding. And when I look back on the code I wrote then I can barely read it.
Perhaps it's optimal and can't be improved. Only half of that is certain: it certainly can't be changed. I'd rather disassemble a binary than read my own teenage source code.
But while it may or may not be fun, it's not sustainable or profitable, except in very narrow windows of opportunity.
To some game developers, game development is like going to war. You have to be tough like a soldier, and all the props and niceties of civilian life (unit testing, source control) are luxuries you don't get on the battlefield of a game project. There's no time. There's too much chaos, and that goes with the territory. Game projects necessarily have gruelling schedules because they must have them to survive. It's a tough market, and if you're not prepared to wade through the swamp in combat boots and a 30kg pack then you're not cut out for game development and you should go back to the unicorn-and-rainbow world of civilian development, because you are soft.
I think these guys (invariably this is a guy thing) are delusional. They're living in their own game world.
This sort of attitude may convince starry-eyed coder kids about who has the most chest hair, but it seems to blind some game developers to the simple fact that games are made of the same material as all software: source code. Software practices are solutions to a common problem: delivering a faster stream of features with an acceptable level of performance and an acceptable defect density, not to mention making the right direction and priority decisions.
In the past I would have been more convinced by arguments about the absolute need in games for hyper-performant code and crazy schedules, but these days I think those things are less true. As happened in non-game development before it, the necessity of maximal performance is becoming a marginal concern, relevant to fewer projects less of the time. Games used to win success on technology; now that happens less than ever.
The online games markets, especially so-called "casual games", have grown phenomenally in recent years, such that established games companies are taking notice of a new generation of companies that understand the importance of community, longevity, and viral distribution channels like Facebook.
I think we'll see more legacy game code.
All of this suggests that the majority of game projects are less and less like the mega-hit, tech-driven games of the past few decades, and that a larger part of the growing gaming market is in projects that could easily benefit from decent software development practices: maintainable code, automated test coverage and so on.
There are a lot of small software projects in the games industry. Small iPhone projects probably don't have enough bugs to warrant industrial-strength bug tracking, especially when the whole team is sitting around one table and the duration of the project is less than six weeks.
Several of the engineering practices I've mentioned seem better suited to larger teams and projects that release versions on a regular basis. For small projects that aren't like this, the benefits of these practices are arguably reduced.
I absolutely consider this to be a relevant factor for code review and sophisticated bug tracking, but I think the reasons small game projects go without source control or unit testing are more questionable.
Flash, for example, very popular for online games, strikes me as one of the worst platforms to integrate source control and unit testing into. I've done some Flash programming and I just gave up and went with the flow: no unit tests and no source control.
However, if I were going to write games in Flash on an ongoing basis, I would find the flashunit framework I'm sure exists, and if it doesn't I would build it. I'm vaguely aware of approaches in Flash and ActionScript to solve source control issues like merging, and you can bet that I would prioritise getting that stuff working. If I had a team working on an MMOG in Flash, I would make sure I put in a Continuous Integration engine, probably Bamboo, and not just because I work for Atlassian. Having had these tools and practices save my arse several times, I'm convinced they're worth the trouble to set up. Hopefully my fellow game developer teammates would not be resistant.
What Do You Think?
I'm not presenting this stuff about game developers as fact, and I'm making a lot of generalisations and a lot of assumptions. I'm not trying to insult any game developers; in fact, I'm interested in learning more about game development, especially on the tooling, process and architecture side.
Am I being unfair? Am I out of date? Am I wrong?
Feedback, flames and factual corrections are most welcome.