Developers claim that their productivity is hampered by employers who force them to work on old, slow hardware. Just how old is the development system you're forced to use?
A couple of years ago, CIO had a feature story called 20 Things in 20 Minutes which listed short, tactical to-do items CIOs could take on to improve their organizations. Each CIO writer and editor (I was in the latter category at the time) was to supply one such item. After interviewing several developers and asking on discussion lists, I submitted three — and my favorite landed on the cutting room floor. I hate when that happens.
In short, said developers: 20 minutes is plenty of time to sign an equipment requisition to buy new hardware for the development staff. "Buy modern machines for developers. Buy graphics cards to allow dual monitors. Buy monitors," summarized one programmer. (And while you're at it, upgrade the Windows 95 Pentium PC that's acting as the source code repository. A new print server would be a good idea, too.)
There was passionate consensus about this point, at least in one online developer community: Hardware is the easiest thing to fix, but apparently not the first thing that comes to mind for managers.
"The single most efficient way to improve quality is to increase productivity, and the single most important and efficient way to improve productivity is to give the engineers blazing fast computers and big, dual monitors," added a developer named Jon (I'm not sure when the statute of limitations on "you may quote me" runs out, so I'm coy with names here). "Developers have to deal with more user interfaces and manage more information than any computer users out there... To run all this stuff we need lots of RAM and CPU."
This surprised me. I was aware that Joel of Joel on Software had included fast computers in his Programmer's Bill of Rights, but I thought that was a baseline, not a wish list. Until that discussion a few years ago, I'd assumed that typical developers worked with the latest hardware, because that's the way it's been at the Schindler bitranch. It's one positive side effect of spending years as a computer consultant and independent contractor; you might have to buy the computer yourself, but at least you can prioritize the purchase when you decide it's important.
But I'm still not certain whether this is an epidemic or an isolated problem. After spending a week at the Open Source Convention, I concluded that, based on the number of MacBooks, a casual observer would assume it was an Apple conference. I've certainly heard plenty of anecdotal evidence suggesting that some developers, given the choice, would take an older iMac or an aged relic running the Linux distro of their choice over the fastest PC running Windows. But many don't have a choice... and nobody even thinks to ask them how hardware might improve their productivity.
So just how irritated are you with the development hardware that your employer has chosen for you? What do you most wish they would change?