I just read an article by Joel Spolsky on multi-tasking and computers. He really opened my eyes to how quickly the cost of switching between tasks piles up.

For years, yes years, I thought I was getting more done by doing multiple things at once. Seems efficient, right? WRONG! It really does take more time to do multiple tasks at once than to do one task at a time.

Now, Joel is a computer guy (he develops software). He has a great analogy that even a non-computer person can relate to, so I will paraphrase it here:

“…You have two computations to perform, A and B. Each computation requires 10 seconds of computer time.

You can either do computation A to completion, then B (one after the other) or you can multi-task.

For the sake of this argument, let’s say that when you ask the computer to multi-task, it runs task A for one second, then switches to task B for one second, then switches back to task A for one second, and so on, and that each task switch takes no time at all. Now, we all know that never happens when we perform tasks; getting “back up to speed” does take time, but play along…

In both cases, you have to wait 20 seconds to get both of your answers. But think about how long it takes to get the result of each computation, A and B.

With multitasking, the result of A takes 19 seconds to arrive and B takes 20… yet with sequential processing, A is ready in only 10 seconds (and B in 20). On average, each answer arrives much sooner.

He takes his computations further and assumes each task switch takes one minute. Sequentially, the computer would use 80 seconds to do A and then B (20 seconds for the two tasks plus 60 seconds for the single switch)… a little over a minute.

But look at what happens with multi-tasking: add up the 20 separate one-second slices and the 19 one-minute switches (20 seconds + 19 × 60 seconds = 1,160 seconds). That’s OVER nineteen minutes!”
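Joel’s arithmetic is easy to check with a small sketch. The function names, the `quantum` parameter, and the simulation structure below are my own illustration, not anything from his article; the sketch simply replays both scheduling strategies and reports when each task finishes.

```python
def sequential(task_secs, switch_cost):
    """Run tasks one after another; return each task's completion time."""
    clock = 0
    finished = []
    for i, t in enumerate(task_secs):
        if i:
            clock += switch_cost  # one switch between consecutive tasks
        clock += t
        finished.append(clock)
    return finished


def round_robin(task_secs, quantum, switch_cost):
    """Alternate tasks in fixed time slices; return each completion time."""
    remaining = list(task_secs)
    finished = [None] * len(remaining)
    clock = 0
    first_slice = True
    while any(r > 0 for r in remaining):
        for i, r in enumerate(remaining):
            if r <= 0:
                continue
            if not first_slice:
                clock += switch_cost  # pay the switch penalty before each new slice
            first_slice = False
            run = min(quantum, r)
            clock += run
            remaining[i] -= run
            if remaining[i] == 0:
                finished[i] = clock
    return finished


# Zero-cost switches: A and B each need 10 seconds of work.
print(sequential([10, 10], 0))        # A done at 10s, B at 20s
print(round_robin([10, 10], 1, 0))    # A done at 19s, B at 20s

# One-minute (60-second) switches.
print(sequential([10, 10], 60))       # total 80 seconds
print(round_robin([10, 10], 1, 60))   # B finishes at 1,160 seconds
```

With zero-cost switches, the totals match, but sequential processing hands you answer A nine seconds sooner. With 60-second switches, round-robin balloons from 80 seconds to 1,160 seconds, just as the quoted passage works out.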

Consider how fast those switching costs add up… Enough said! No more multi-tasking for me.