Can People Really Program 80 Hours a Week?
Obviously one can program 80 hours a week. The question is whether you should.
There are really two sides to the question: is it good for the programmer to program more hours, and is it good for the project? These are not necessarily the same question: there are exceptionally important projects with close deadlines (see Marc Stiegler's David's Sling for a fictional example) where the end result is so important that destroying programmers is acceptable collateral damage, so long as the project is completed within a very tight time frame.
So: is it good for the programmer to program more hours in a week?
On average, no. Sure, there are some programmers who can maintain an 80-hour week indefinitely, but the modal programmer will suffer a loss of productivity, as well as undesirable physical and mental side effects, after some period of extended extra work.
Studies as far back as the first decade of the 20th century have demonstrated lessened productivity and increased accident rates with 10-hour work days compared with 8-hour work days. Modern studies of stress and sleep deprivation show that losing as little as an hour of sleep a night can have an effect similar to taking one or two stiff drinks.
Personal experience suggests that there is an average threshold after which effects become more severe -- somewhere between two and four weeks of 50+ hours. That doesn't mean there are no effects earlier, it's just that the effects become more obvious with time and hours.
How about the project? Is it beneficial to the project to work more hours?
In some limited cases, yes. Even as productivity (output per unit time) decreases, increased work time can result in an actual increase in total productive output -- nine hours at, say, 89% effectiveness still yields slightly more than eight hours at 100%. So if you really need to have something done next week, you can get extra work in by crunching.
Unfortunately, productivity continues to decline over time as extra hours are put in, and while there is a limit to the number of additional work hours that can be put in, there is no practical limit to the reduction in productivity -- it can go negative, and do so in spectacular fashion (insert gratuitous story about wiping out the codebase or fragging development machines here).
Determining the breakeven point is hard: programmer productivity is a complex and difficult measurement that most companies are simply not willing to invest in. Programmers themselves have concerns that some artificial measure will be applied to their performance reviews. Managers who know programmers have concerns that programmer behavior will be modified to "game" any artificial measure to the detriment of some nebulous "actual productivity". As someone who's been both a manager and a programmer, I think both concerns have validity.
I could write extensively about specific measures of output, but suffice it to say that it is a difficult task and there is no commonly accepted rigorous measure (especially in the games business) of programmer output, and thus none of productivity (output per unit time).
In light of this lack I fall back upon my personal experience: crunching (50+ hour work weeks) has limited utility for most projects. After only 2-4 weeks of crunch, most programmers have lost so much productivity that they are getting 40 or fewer hours of "real work" done. From a project point of view, they would be getting at least as much real work done if they dropped back to 40-50 hour weeks, and from a personal point of view they would be much healthier and happier. And they would be saving something up for that next crunch time.
needs some expansion:
The "right" way to answer the question "how do I produce maximum output" (which is what a project probably wants) is to figure out the relationship between hours worked and productivity, then graph total output per day according to hours worked per day, and select the maximum value (obviously there would be more information than this -- you'd need to evaluate productivity based upon previous numbers of hours worked as well, for example).
Unfortunately, figuring out how much output a programmer produces is hard. Sure, you can measure KLOC (thousands of lines of source code) pretty easily, but that's not widely considered a good measure of actual productive output. You can go to all the effort and trouble of calculating Function Points for your project and see how many each programmer produces in how much time, but you'd also have to go to the trouble of assigning debugging work back to the Function Point that caused the problem (and thus to the programmer who wrote the code), and there would inevitably be situations where it wasn't clear where the time should be allocated.

But the fact of the matter is that no game company is going to expend the time up front to spec a project to the point where FPs can be figured. It's hard enough to get companies to produce a first or second draft WBS (work breakdown structure), much less a complete (and believable) set of tasks broken down to single-day precision (or better).
Even if you got a full FP breakdown of your project, would that provide all the information you required? I think not, because there is more to tasks than the number of FPs involved. Because FPs are a user-oriented size measurement, they do not (in my opinion) adequately measure the size of highly interactive programs like computer games. [I'm willing to be educated on this point, BTW].
In my opinion, there is no extant rigorous measure of computer programmer output that adequately deals with program size, high levels of interactivity, and difficulty of programming task. And if there were, it would be too expensive (in terms of time and rigidity) to apply to relatively agile projects like computer games.
Which leaves us with a theoretical discussion of productivity, inadequate measurements, and personal experience. Thin reeds, perhaps, on which to hang $10 million development budgets.
But hey, if management consulting was easy, anyone could do it!
I realize that common sense is out of fashion as a management technique, and that it might seem to point toward the idea that "more hours" == "more work". But let's try applying it anyway:
What happens to people who work a lot of hours? They get stressed. They get tired. They make more mistakes. If they're not terribly dedicated, they start going through the motions instead of being engaged in their work. Maybe they find other things to do during their time "at work", if they can. They certainly start having things they must do (like eating, like going to the doctor) impinge upon their "work time". In other words, their "work time" becomes less and less productive. They produce less output for each hour worked.
How much less? Hard to say (see discussion above about how precise measurement of programmer output is hard). But there are industrial studies which show overall output increases if you reduce a workday from 9 hours to 8. Read that again to make sure you got it: an 11% reduction in work hours (from 9 to 8 daily) resulted in an (unspecified) increase in output. That means the increase in hours from 8 to 9 resulted in at least an 11% reduction in productivity on an hourly basis.
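To spell the arithmetic out (a back-of-the-envelope check, assuming only that output stays flat when the day drops from 9 hours to 8):

```python
# Back-of-the-envelope check: if 9 hours produce no more output than 8 hours,
# the per-hour rate at 9 hours is at most 8/9 of the 8-hour rate.
output_8 = 1.0              # normalize an 8-hour day's output to 1.0
output_9 = output_8         # conservative: assume output merely stays flat
rate_8 = output_8 / 8       # output per hour at 8 hours/day
rate_9 = output_9 / 9       # output per hour at 9 hours/day
drop = 1 - rate_9 / rate_8  # fractional drop in hourly productivity
print(f"Hourly productivity dropped by at least {drop:.1%}")   # at least 11.1%
```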
I don't know about you, but I think it's common sense that the increase from 8 to 9 hours will be less of a productivity hit than an increase from 8 to 10 hours -- or, for that matter, from 9 to 10 hours. I think each additional hour beyond the eighth hits your productivity harder than the one before it. So if you're losing 11% in the first extra hour, what do you lose in the second? 15%? Even if you assume the loss stays constant at 11% per hour, you lose productivity fast, and beyond about 10 hours a day you lose output at a tremendous rate:
| Daily Hours | Productivity | Daily Output | 5-Day Weekly Hours |
|---|---|---|---|
| 8 | 100% | 100% | 40 |
| 9 | 89% | 100% | 45 |
| 10 | 78% | 98% | 50 |
| 11 | 67% | 92% | 55 |
| 12 | 56% | 84% | 60 |
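(For what it's worth, the table is just the constant-11-point assumption cranked through a loop. Here is a throwaway sketch that reproduces it; the whole-percent figures above are these values rounded, so 97.5% shows up as 98%.)

```python
# Reproduce the table: productivity drops a constant 11 percentage points per
# hour beyond eight; daily output is hours worked times that rate, expressed
# relative to an 8-hour day at full productivity.
print("Daily Hours | Productivity | Daily Output | 5-day Weekly Hours")
for hours in range(8, 13):
    prod_pct = 100 - 11 * (hours - 8)    # 100, 89, 78, 67, 56
    out_pct = hours * prod_pct / 8       # relative daily output, in percent
    print(f"{hours:>11} | {prod_pct:>11}% | {out_pct:>11.1f}% | {hours * 5:>18}")
```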
Do I believe this? Yeah, I do. I think the model is simple enough that it probably breaks down eventually. As I said above, I think that productivity probably drops faster than a constant 11%. I also think that there are really multiple factors here: output is a combination of creation (which is what we're looking at here) and destruction (negative creation -- in the case of programming, the creation of additional work by writing bad code, defect insertion, and so on). I think creation probably drops a bit more slowly than we're looking at here and destruction rises much more rapidly.
All that said, however, I think this table is pretty close to the truth, and I think it points out the essential correctness of my gut feeling: 50 hours a week doesn't cost you too much overall, and it can boost your short-term creation (get those features in before the milestone!) without too big a later hit from inserting bad code and bugs, or from making bad decisions that have to be re-made later.