28 October 2013

Show Me Numbers! Automated Browser Performance Testing

Here's my talk from AtlasCamp 2013, Atlassian's annual plugin developer conference held in Amsterdam.

My topic is Automated Browser Performance Testing.

JavaScript developers often talk about performance, always tweaking to make the user experience faster. The good developers carefully and accurately measure this performance in multiple browsers. The great developers automate this measurement.

In this talk I tell the full-stack story of the design and implementation of User Experience (UX) performance tests in JIRA. Learn techniques critical to improving not only actual performance but also perceived performance - and when the two are not the same thing!

23 April 2013

Nice Syntax for Ignoring Exceptions in Groovy

Save your breath. I don't need to hear how we shouldn't ignore exceptions. I agree with you. We should use exceptions as a part of an API and we should accept exception handling as an important, fundamental design element in our code. Happy?

OK now, sometimes you know you need to ignore an exception. Often it's a narrow exception type. For example, the following exceptions are frequently ignored with good reason (usually because the exception cannot happen in practice):

MalformedURLException
NumberFormatException
IllegalAccessException
UnsupportedEncodingException

Groovy inherits Java's exception handling syntax: the try, catch and finally blocks. While writing some code that ignored exceptions, I wondered what a Groovier way to do things would be. The following Config class is a kind of bean containing validated values read from a properties file. The defaults remain if the config file cannot be parsed for the given value.

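A minimal sketch of the idea (the class lived in an embedded gist that may not show here, so the names and defaults below are illustrative):

class Config {
    // Defaults survive if the config file cannot be parsed for a value
    int threads = 4
    int timeoutSeconds = 30

    Config(Properties props) {
        try {
            threads = Integer.parseInt(props.getProperty('threads'))
        } catch (NumberFormatException ignored) {
            // keep the default
        }
        try {
            timeoutSeconds = Integer.parseInt(props.getProperty('timeout'))
        } catch (NumberFormatException ignored) {
            // keep the default
        }
    }
}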

Notice how the try {...} catch {...} looks like Groovy's closure-passing syntax, as if there were a try method being passed a block. In fact I think Scala's equivalent is literally this (also, Scala is awesome).

But while it looks that way, it doesn't quite act that way. 

But could it?

What I really wanted to have was something like this:

ignore (NumberFormatException) {
    threads = Integer.parseInt(config?.threads as String)
}

This would ignore a NumberFormatException if one was thrown; if something else was thrown, it would propagate out of the block as usual.

As I expected, this is pretty easy to implement in Groovy, and after a brief search I couldn't really find anything like it in the standard libraries or the blogosphere at large.

Here's my implementation with some example code:


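The embed may not show here either, so the following is a minimal sketch of such an ignore method, reconstructed from the description above rather than copied from the original gist:

def ignore(Class<? extends Throwable> type, Closure block) {
    try {
        block()
    } catch (Throwable t) {
        if (!type.isInstance(t)) {
            throw t   // anything else propagates as usual
        }
        // a matching exception is deliberately swallowed
    }
}

// Example: threads keeps its default when parsing fails
def threads = 4
ignore(NumberFormatException) {
    threads = Integer.parseInt('not a number')
}
assert threads == 4

// A different exception type still escapes the block
try {
    ignore(NumberFormatException) { throw new IllegalStateException() }
    assert false // never reached
} catch (IllegalStateException expected) {
    // reached: only NumberFormatException is swallowed
}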

Run this on the live GroovyConsole yourself.

What do you think? 

31 January 2013

Activate Remote Disc from the Command Line on Mac OS X

On Macs without a DVD drive (i.e. everything these days) OS X is configured to allow easy use of a "Remote Disc" whenever another machine has enabled DVD/CD sharing.

The trouble is that the "Remote Disc" device was not showing up on my old Mac Mini (its DVD drive is broken), and although I can usually avoid optical media, I recently wanted to install something from a DVD. The machine technically has a DVD drive, which is presumably why the option did not show up in the Finder preferences under Sidebar.

Fortunately these preferences can be turned on from the command line as follows:

defaults write com.apple.NetworkBrowser ODSSupported -bool true
defaults write com.apple.NetworkBrowser EnableODiskBrowsing -bool true

Enter those in the Terminal and then relaunch Finder using the Force Quit menu so the settings take effect.
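
Alternatively, you can relaunch Finder without leaving the terminal:

killall Finder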


Remote Disc icon now shows up in the Finder Sidebar under Devices

Then you should see the remote disc icon and can mount remotely shared DVDs and CDs. Note that you must also enable DVD/CD sharing on the machine with the disc.


19 October 2011

New Stuxnet Virus Beta Version Discovered

Stuxnet, the most advanced computer virus ever discovered (by a long way), and generally believed to be the first sophisticated example of a virus being used as an international military weapon, just got an upgrade - most likely from its original authors.

At least, that's according to Symantec's analysis lab, where a recently found version has been dissected on the operating table. It looks like the versions found in the wild "in Europe" were targeted test runs of a new version, partly recompiled from the original source code, which has so far remained secret.

The new version carries a different payload from the original Stuxnet, which, it is generally believed, was intended to shut down or disrupt Iranian nuclear facilities. In fact the new version's payload is pluggable: modules can be loaded dynamically from a command and control server located in India.

Duqu, as the new version is called, can read and write files, execute programs, capture keyboard input (to collect passwords) and generally open back doors for later use by the unknown attackers.

Check out the great article on this by Read Write Web, and for a quick recap on the original Stuxnet, check this nicely produced - if creepy - video by Hungry Beast in Australia:

09 October 2011

Why Programming is Hard

Programming is not hard because computers are fussy. Programming is hard, but not for that reason.

There are plenty of pesky surface details in managing programming projects and programmers regularly find themselves burdened with managing tools and configuration details. More so when using Maven (Zing!).

But this is just work. Annoying repetitive work. It's not what makes programming hard.

In fact, programming is hard because there are so few limits to what you can do.

Certainly there are limits to computability, and there are famous problems and algorithms that define the state of the art in computer science, but for many programmers, merely knowing about these limits is enough to keep them from pushing up against them on a regular basis.

Programming is hard because the only limit is our own mental capability. It's always going to appear to be hard. It doesn't matter how smart you are.

If programming is not hard, then you're doing it wrong. If your brain is not being fully utilised, you are being limited by your tools and you need to replace them.

Time spent dealing with repetitive details, remembering lots of things or even managing huge amounts of code can generally be eliminated with more programming! This assumes you're in a position to change the system enough, and perhaps this is where practical project tradeoffs impose themselves. And perhaps "the business" isn't interested in funding improvements to the open source build tool, or switching from Java to Scala. (Double Zing!) But those are not programming problems.

For me, because software is so soft, the goals can be altered subtly, continuously, destructively - especially when ill-defined. The hardest thing generally comes down to dealing with the confusion of some form of not being clear what I'm trying to do: unravelling the "is it this or is it that" trail of mental plans that led to the current state. When this happens in the large, a software project is almost certain to fail.

But maybe that's just me. Do you think programming is not hard? Do you think it's hard for a different reason?

30 September 2011

32 Bit 64 Bit - All up in your Bitness

Here's one for the logophiles, the English usage nerds. I'm officially coming out as a bit of a language nerd today, so consider yourself warned.

In film they talk of footage. In transport, haulage. Then there are words like passage, patronage and of course cleavage. At school my friends and I would regularly put "-age" after words. "Foodage?" would mean "are you hungry?". "Skateage" was time spent skateboarding.

So what is it when you want to refer to how many bits something is?



You know your latest laptop is 64 bit and your last one was 32 bit. You know about 16 bit retro video games like Sonic the Hedgehog and maybe even the "8 bit era" of the Commodore 64. So what do you call that ... bitness?

I've heard it called bitness, and tweeted about it a while ago and got some interesting alternatives:

  • bitness
  • bittage
  • bittitude
  • bitality
  • bitosity
  • bitch
I guess the last one is a little challenging, linguistically. Maybe German?

Of course I had lots of replies to my tweet saying, well, you're talking about "word length" or "machine architecture" or some other term specific to the usage. Well, what if I'm talking about sound samples? "Is that a 16 bit sample or a 32 bit one?" Or colour space in digital imagery? Is that 24 bit colour? Ignoring bittitude is just like saying you can't call it footage, you have to call it film length.

We need to agree on a word that covers bitness and I think I'm coming down on the side of bittage.

Of course this is a question of crucial importance to the world (not - but I did warn you).

What do you think?

image credit: jeffrey on Mr Breakfast

24 August 2011

Last Minute Usability

Usability, like scalability and security, cannot be added to a finished product, any more than flavour can be added to a finished meal.


Delightfully Delicious
delicious photo by Shutter Ferret

19 April 2011

This is NOT the 80:20 Rule You Are Looking For


The 80:20 rule is based on the assumption that there is (often) an 80% (ish) subset of the realised value of a project that costs only 20% of the resources (e.g. time) to complete. The rule - actually the Pareto Principle, popularised by Richard Koch and others - can, like all rules of thumb, be applied incorrectly.

It's not a license for doing things half-arsed.

06 April 2011

Automated Testing in Games

So I clearly upset a few game developers with my last blog post. Sorry about that! I do find it a bit surprising how strong some of the emotions were but I probably deserved the beating I got on reddit.

Second, my post was too long and, based on my stats, it looks like less than 1% of visitors read the article. Unless they can read a couple of thousand words in 20 seconds. Again, that was my fault.

But what was interesting was that the responses to my article were pretty clearly divided into two main groups (disagreeing and flaming) and one small group (agreeing):


  1. You are on crack, we do this stuff (automated testing, etc) better than anyone.
  2. Game developers are moar awesome than normal developers and unit testing, agile and whatever else you are pushing doesn't apply to the game industry because of [one of]:
    1. How can you unit test games anyway that doesn't even make sense!
    2. Deadlines. We don't have time.
    3. Games have hundreds of gigs of assets.
    4. Console contracts / monetisation models etc.
  3. I agree that the game industry suffers from some poor tool and process adoption habits. 
Those who use the practices I've been talking about - and to keep things specific I think automated testing is an excellent candidate to focus the debate - generally didn't seem too upset with me. In fact I know plenty of game developers in this category.

Those who believe that their game studios are backward generally cannot speak publicly about ANYTHING, the only exceptions being people who have left the industry, like munificent, who says:


I worked at EA for eight years. My running joke-that-was-not-a-joke was that we were twenty years behind the times, but I was probably exaggerating. Fifteen is a pretty good estimate.
At EA, I think the problem was that the company culture was built like this:
  1. A bunch of bedroom hackers in their teens started studios and shipped games ten to twenty years ago.
  2. Those games got successful, and the studios scaled up. The founders hired young people under them.
  3. More staff churn. Fresh grads come in, burned out experienced coders go out.

Munificent went on to qualify and contain his own statements as generalisations (as I did with mine).

Tool and process adoption in development teams is of particular interest to me and I have an ongoing conversation with several developers about the barriers to solving problems they have in this area.

As for the middle group, the question boils down to this:

If automated testing is to some degree not suited to the game industry, then why?

If the answers are things like "How could you test Starcraft 1?" or "You can't test for things like game balance", then I'm sorry, that's not a convincing reason! How is it obviously impossible to automatically test Starcraft? I'm sure game developers heavily using automated testing (perhaps even at Blizzard) will call bullshit and say "yes, you can automatically test it".

You Can't Write a Test for X

I've seen so many of these debates, and have had them myself since the 90s. I've slowly come to believe, by being convinced on a case-by-case basis, that you can test the vast majority of what you think you can't test. I've learned this working on a number of different software projects in different industries.
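
To make that concrete, even something as fuzzy-sounding as game balance becomes testable once the rules are separated from the rendering. A toy Groovy sketch (every name here is invented for illustration):

class Combat {
    // A pure damage rule: no screen, no engine, just arithmetic
    static int damage(int attack, int armour) {
        Math.max(1, attack - armour)   // every hit chips at least 1 point
    }
}

// A balance-style check: no amount of armour makes a unit invulnerable
(0..1000).each { armour ->
    assert Combat.damage(50, armour) >= 1
}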

If the answer is that the cost-benefit trade-offs on a project led to the conclusion that it's not worth doing here and there, then that's interesting. I'm looking for examples, because it may well be a tooling problem.

I'm not saying that everything can be tested automatically, and perhaps the up-front cost of writing the tests is never outweighed by the benefits of having the tests because of game industry specific business features or whatever, but I think many of those things are currently subject to rapid change.

Game Revenue

WoW (World of Warcraft) is a good case study. This is the sort of game that couldn't be funded 15 years ago. It plays similarly to the MUDs of the early 90s, with far better graphics etc., but the market was too small for such a grand vision then. We were on 14.4k modems (or worse) and the number of internet users was tiny. WoW has recurring revenue. This is a feature unlike traditional games, but not unlike traditional and modern non-game software.

With ongoing revenue you can fund ongoing development in a more continuous way. Updates can be done more frequently and codebases are encouraged to have longer lifespans. Longer lifespan codebases mean legacy code. There's nothing to encourage you to invest more in automated testing like having to maintain legacy code!

I think these are important factors, driven by the modern consumer internet and the market forces that come from it. These forces push game code towards a scenario where automated testing has a bigger benefit.

That doesn't address the cost side. You might think that the costs of automated testing, particularly unit testing, are fixed. Well, in my experience of unit testing over the past 12+ years, the biggest barrier to automated testing is bad design. Here's what munificent (who has since moved on to Google) had to say about that:

Unit testing would be awesome, but the thing is... you need units for that. A lot of the game codebases I've seen are closer to "giant hairball" than "collection of components", which isn't amenable to unit testing.

When I was writing software in the 80s and 90s I never wrote automated tests. I thought testing had to be manual and had to be done when the system was "finished". I don't think that any more. When someone tells me that they have to test that way, I ask why, and I think I'm right to be suspicious of weak, vague answers.

05 April 2011

Are Game Developers 15 Years Behind The Rest of Us?

Game developers have a reputation, deserved or not, of being a nasty unwashed rabble with no process, no discipline and no skills outside of graphics performance and playing video games. They shoot from the hip, stay up all night and cut corners to hit the ship date, just like all developers, but they commit these sins to a much greater degree than the average developer, or so it goes.

It seems much more common that game developers, as opposed to non-game developers, will not have automated unit testing, measure code coverage, adopt agile processes, use proper source code management systems, have proper backups (!), do code reviews, track bugs or do continuous integration.

Obviously, there are going to be game developers who do follow these engineering practices, but a reputation is a form of stereotype.

So I'm wondering how much truth there is in this or if it's a total myth.

Are game developers really less skilled in the big picture of software development? Are they really 15 years behind the rest of the industry on common best practices? Do their knuckles really drag on the ground? And if so, why?



I'm no expert in game development but I've always been very interested in it. My observation of a number of game projects and my contact with several game developers suggests there are a number of factors that could contribute to the possibility that game developers do not follow many of these practices.

Games Projects Are Not Like Other Software?

It seems true that game projects are usually dominated by a media production workflow. Like CG movies, modern AAA megabudget console titles have teams of artists who churn out gigabytes of high quality graphics and other media. The game project's process and its project management is totally dominated by the needs of this media production. Game project managers are quite often called "producers", the companies that make games often call themselves "studios" and, like movies and music, there are "publishers" who handle distribution and marketing. Funding models for many game studios likewise seem to follow the movie industry.

Interestingly, indie developers react strongly against this model of game development and argue that better games are the true goal, not bigger budgets.

By contrast, a multi-million dollar non-game software project will have a team where visual design work, including "UX" (user experience) design, often takes up less than 10% of total resources, while software developers and possibly business analysts (the equivalent of a game designer) represent the majority. Obviously there is huge variation here, but the contrast is clear.

The basic idea behind this factor is that the division of labour on the project tips the scales towards a different set of best practices: ones that are, arguably, better for making the cost-benefit tradeoffs and managing the risks of those projects.

Game Developers Don't Know Better?

I've been told that game developers mostly started in their bedrooms and that their skills have not expanded very far due to lack of exposure. Nobody convinced them to use unit testing, so they don't do it. Game development tends to be a passion. I know lots of skilled developers outside of game development who started coding all night in their bedrooms. But just as they no longer use BASIC or name their variables with expletives, they no longer code games. I'm describing myself here too; I started programming like this.

Another aspect to this is that a lot of self-professed game developers participating in online discussions about this stuff literally are kids in the bedroom!

This factor is not very convincing on its own, but it seems that the game development sector is separated from other software development, and the rule of passion is enforced in game development companies in both directions: game developers feel strongly that they would never want to work on other projects, and game development companies have a strong policy of only hiring people who have a passion for games. If this were true for insurance companies, they'd never find candidates. People don't get passionate about insurance. Part of the truth of this factor may be that the segregation of game developers is so strongly self-policing.

With some exceptions, there are still not many game development courses run by universities, whereas the materials in the general software engineering and computer science courses are often specifically tuned to non-game industry problems. Think of the number of examples in programming tutorials about e-commerce or payroll!

This lack of tertiary education might not be such a big deal, except the game developers I've spoken to think that it is. They believe that game development is a speciality that needs special courses. Managing the media pipeline mentioned above is often cited as a part of this. As are the fast-moving hardware technology advances such as GPU architectures and the iPhone-style app-store project publishing aspects of the game industry. Business IT courses are only gradually starting to touch on this.

Games Don't Ship Upgrades?

Traditionally, games are shipped in their final state.

Back in the olden days people bought physical media in a physical retail outlet. I'm only partially joking here; clearly that method is quickly disappearing, and luminaries like John Carmack have predicted that next-generation consoles will cater predominantly to online distribution. Games that require dozens of gigs of media have seemingly had no alternative but to rely on ever-increasing optical media densities to ship their bits.

While most games these days seem to have upgrades that are made available online, these are almost universally free, contain bug fixes only and are staffed minimally. Games do not generally go through the same lifecycle as, say, Microsoft Office, where a new version of the same software is released on a regular basis and customers usually pay to upgrade.

The effect of this is that many of the practices in non-game software development that pay off only in the long term don't seem to be worth the trouble in the short term.

While this factor may have been true in the past it is becoming increasingly less convincing across the board. With the rapid growth of gaming segments such as MMOGs, casual games and mobile games on always-online devices like iOS and Android phones and tablets, I expect a greater proportion of games to get their revenue from long-running projects that include multiple upgrades or in-game purchases that fund continued development of the software.

When the software is under continual development, as is most pronounced in SaaS, the payoff for the investment in quality engineering practices becomes more obvious. Games that are developed without such practices, as with non-game projects, will often become bogged in technical debt or suffer disasters that severely impact the businesses behind them.

By contrast, those game development studios that adopt automated unit testing regimes and invest in infrastructure like modern version control systems (e.g. Git, Mercurial) and Continuous Integration (CI), not to mention bug tracking and things like code review, have a potential competitive advantage that can lower the cost of development over time.

Games Can't Be Unit Tested?


I've heard several game developers state that game development is different in a technical sense, such that unit testing (and other related engineering practices) are not suited to game systems. While, when pressed, these people will concede that certain parts of game development are like any other software in their suitability for unit testing etc., they argue that there is a fundamental mismatch.

To be frank, I find this argument to be just as weak as it is in non-game development projects. I'm not saying that everything can be unit tested; that's not even the question. The question is whether the effort spent on unit testing is outweighed by the benefits derived. The trouble is that the vast majority of "things that can't be unit tested" that I have been shown over the years are really just design problems in the code that only become apparent when a unit test is attempted. Making a small change to the production code fixes the problem. For those developers (both inside and outside game development) who have never gained the benefit of a good regression test suite, the idea of changing the production code to make it easier to test sounds like pure madness.

Of course it's not pure madness, it's just the scientific method.
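
To illustrate the kind of small change involved, here's a contrived Groovy sketch (the names are mine, not from any real engine): the spawning decision is extracted as a pure function, and the side effect is injected, so the rule can be tested without booting the game.

class Spawner {
    Closure render   // the side effect is injected, so tests never need a screen

    void tick(long now, long lastSpawn, long intervalMs) {
        if (due(now, lastSpawn, intervalMs)) {
            render.call('enemy')
        }
    }

    // The decision itself is a pure, unit-testable function
    static boolean due(long now, long lastSpawn, long intervalMs) {
        now - lastSpawn >= intervalMs
    }
}

// The rule is trivially testable...
assert Spawner.due(1000, 0, 500)
assert !Spawner.due(1000, 900, 500)

// ...and the side effect can be recorded instead of rendered
def drawn = []
new Spawner(render: { drawn << it }).tick(1000, 0, 500)
assert drawn == ['enemy']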

Game Projects Are Too Hardcore For That Crap 

I've heard this one from game developers who, let's just say, do their bit to perpetuate the stereotype of the macho code-all-night, pizza-eating, caffeine-guzzling, engine-optimising dude. All-nighters are not generally seen as sustainable by most enlightened teams I've worked with, and whenever it's done it's seen as fail. Fail happens. Heroic efforts are sometimes necessary. And if you feel that the resulting burnout is a badge of honour, then it can be kind of fun. It's like the pride an extreme sports junkie takes in their broken limbs and sick x-rays. We all know this is true because most of us have been there.

Like a "breakdancing battle" I remember many schoolyard bragging sessions about how many sprites I could animate on screen at once. My friends challenged me to make more impressive graphics than them and I relished the challenge. I believed in my ultimate mental power over the machine. I felt invincible and omnipotent in my coding. And when I look back on the code I wrote then I can barely read it.

Perhaps it's optimal and can't be improved. Only half of that is certain: it can't be changed at all. I'd rather disassemble a binary than read my own teenage source code.

But while it may or may not be fun, it's not sustainable or profitable, except in very narrow windows of opportunity.

To some game developers, game development is like going to war. You have to be tough like a soldier, and all the props and niceties of civilian life (unit testing, source control) are luxuries you don't get on the battlefield of a game project. There's no time. There's too much chaos, and that goes with the territory. Game projects necessarily have gruelling schedules because that's what it takes to survive. It's a tough market, and if you're not prepared to wade through the swamp in combat boots and a 30kg pack then you're not cut out for game development and you should go back to the unicorn-and-rainbow world of civilian development, because you are soft.

I think these guys (invariably this is a guy thing) are delusional. They're living in their own game world.

This sort of attitude may convince starry-eyed coder kids of who has the most chest hair, but it seems to blind some game developers to the simple fact that games are made of the same material as all software: source code. Software practices are solutions to a common problem: delivering a faster stream of features with an acceptable level of performance and an acceptable defect density, not to mention making the right direction and priority decisions.

In the past I would have been more convinced by arguments about the absolute need in games for hyper-performant code and crazy schedules, but these days I think those things are less true. As with non-game development before it, the necessity of maximal performance is becoming marginal, suited to fewer projects less of the time. Games used to win success on technology alone, and now that happens less than ever.

The online games markets, especially so-called "casual games", have grown phenomenally in recent years, such that established games companies are taking notice of new-generation companies that understand the importance of community, longevity and viral distribution channels like Facebook.

I think we'll see more legacy game code.

All of this suggests that the majority of game projects are less and less like the mega-hit tech-driven games of the past few decades, and that a larger part of the growing gaming market is in projects that could easily benefit from decent software development practices: maintainable code, automated test coverage etc.

Small Projects?

There are a lot of small software projects in the games industry. Small iPhone projects probably don't have enough bugs to warrant industrial strength bug tracking, especially when the whole team is sitting around one table and the duration of the project is less than 6 weeks. 

Several of the engineering practices I've mentioned seem better suited to larger teams and projects that release versions on a regular basis. For small projects that aren't like this, the benefits of these practices are arguably reduced.

I absolutely consider this a relevant factor for code review and sophisticated bug tracking, but I think the reasons for small game projects not having source control or unit testing are more questionable.

Flash, for example, though very popular for online games, strikes me as one of the worst platforms to integrate source control and unit testing into. I've done some Flash programming and I just gave up and went with the flow. No unit tests and no source control.

I survived.

However, if I were going to write games in Flash on an ongoing basis, I would find the flashunit framework that I'm sure exists, and if it doesn't, I would build it. I'm vaguely aware of methods in Flash and ActionScript to solve source control issues like merging, and you can bet that I would prioritise getting that stuff working. If I had a team working on an MMOG in Flash, I would make sure I put in a Continuous Integration engine, probably Bamboo, and not just because I work for Atlassian. After it has saved my arse several times, I'm convinced these tools and practices are worth the trouble to set up. Hopefully my fellow game developer teammates would not be resistant.

What Do You Think?

I'm not presenting this stuff about game developers as fact; I'm making a lot of generalisations and a lot of assumptions. I'm not trying to insult any game developers. In fact, I'm interested in learning more about game development, especially on the tooling, process and architecture side.

Am I being unfair? Am I out of date? Am I wrong?

Feedback, flames and factual corrections are most welcome.