Back in 1987 computer shoppers were conditioned to spend about $2,000 for a new computer. Buying a computer was a major purchase. It was the sort of thing that you did once for your kid as he went off to college. You expected the student to use the machine all four years of college and maybe afterward.
That’s what my parents did for me as I packed up for college that year. Five years later I replaced that machine with another desktop that was much more powerful (it ran Windows!) for about $1,800.
I benefited from something I’ll call “Moore’s Dividend.” The popular formulation of Moore’s law predicts that “the cost of a given amount of computing power falls by half every 18 months.” That meant that in 1992 I could have:
a) paid much less for the same computer power as I had originally taken to college, or
b) spent the same money ($2,000) for much more computer.
I went with option B. Mostly. Options A and B are really two ends of a continuum. I was closer to the B side, but I also spent $200 less.
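To put rough numbers on that dividend, here’s a back-of-the-envelope sketch in Python. It assumes the popularized 18-month halving holds exactly, which real street prices never quite do:

    # Moore's Dividend, roughly: the price of a fixed amount of
    # computing power halves every 18 months (popularized Moore's law).
    years = 5                     # 1987 -> 1992
    halvings = years * 12 / 18    # about 3.3 halving periods
    price_1987 = 2000.0

    option_a = price_1987 / 2 ** halvings   # same power, lower price
    option_b = 2 ** halvings                # same price, more power

    print(f"Option A: the 1987 machine's power for about ${option_a:,.0f}")
    print(f"Option B: about {option_b:.0f}x the power for the same $2,000")

By that arithmetic, the 1987 machine’s power should have cost about $200 in 1992, or the same $2,000 should have bought roughly ten times the computer. That squares with my $1,800 upgrade.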
And that’s the way it went. Every couple of years I’d buy a new machine. Each would be much more powerful but would cost about $200 less. Like most computer shoppers, I tended to take most of the Moore dividend in added computing power. Greater power meant greater functionality. But we also felt compelled to upgrade because our operating systems and software kept growing whether or not the functionality did. We felt we had to grow just to keep up.
On January 15th, The Economist ran an article on the rise of the cheap sector of the computer business. Netbooks, of course, are the most obvious sign of this, but there are other indicators that computer shoppers are taking their Moore dividend in cash rather than in computer power.
Even Microsoft has cottoned on: the next version of Windows is intended to do the same things as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that runs faster than its predecessor on the same hardware.
The Economist blames the recession for this, but my question is: is going small a bad thing? I’ve long had a fascination with the low end of the computer revolution. I think it’s a good thing when a kid with just a few bucks can cross the digital divide. I’m looking forward to a future landmark: the day when McDonald’s offers a McLaptop for $10 with the purchase of a Happy Meal.
I bought a netbook this Christmas for my kids. It’s no toy: it can handle word processing, spreadsheets, email, browsing, iTunes, and simple games without difficulty. I’m sure I could use it to run the switchboard for the FastForward Radio show. For many people, perhaps most, it’s all the computer they need. That’s the upside: almost everyone in the developed world can now get their hands on a “good enough” computer.
If there’s a downside, it’s this: if there were a killer app that required more computing power, many people would find that they “need” more computer. The lack of a killer app is fueling the race to the bottom more than the recession is.
Case in point: Vista underperformed in the market in part because it offered the usual new-OS bloat without a significant increase in functionality. Now, since the next killer app has yet to arrive, the next Windows will shave the bloat. It’s probably a smart move on Microsoft’s part, but I’m wishing we had the killer app.
And what would that be? I’ll wager it arrives when either AI-driven personal digital assistants or virtual worlds come of age.