

Taking Less from Moore

Back in 1987, computer shoppers were conditioned to spend about $2,000 for a new computer. Buying a computer was a major purchase, the sort of thing you did once for your kid as he went off to college. You expected the student to use the machine all four years of college and maybe afterward.

That's what my parents did for me as I packed up for college that year. Five years later I replaced that machine with another desktop that was much more powerful (it ran Windows!) for about $1,800.

I benefited from something I'll call "Moore's Dividend." A popular corollary of Moore's law holds that "the cost of a given amount of computing power falls by half every 18 months." That meant that in 1992 I could have:

a) paid much less for the same computer power as I had originally taken to college, or

b) spent the same money ($2,000) for much more computer.

I went with option B. Mostly. Options A and B are really two ends of a continuum: I was closer to the B end, but I also spent $200 less.
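To make the arithmetic concrete, here's a minimal sketch of that continuum in Python. The $2,000 baseline and the 18-month halving come from the quote above, and the five-year gap and $1,800 follow-up purchase are the figures from this post; the function names are purely illustrative:

```python
# Moore's dividend: the cost of a fixed amount of computing power
# halves every 18 months (the popular formulation quoted above).
BASE_PRICE = 2000.0      # 1987 purchase price
HALVING_MONTHS = 18.0

def price_of_original_power(years_later: float) -> float:
    """Price, years later, of the same computing power as the 1987 machine."""
    halvings = years_later * 12.0 / HALVING_MONTHS
    return BASE_PRICE / 2.0 ** halvings

def power_multiple(dollars: float, years_later: float) -> float:
    """How many 1987-machines' worth of power `dollars` buys, years later."""
    return (dollars / BASE_PRICE) * 2.0 ** (years_later * 12.0 / HALVING_MONTHS)

# Option A (all cash): by 1992 the 1987 machine's power costs about $198.
print(f"${price_of_original_power(5):.0f}")    # -> $198

# Option B (all power): the full $2,000 buys about 10x the 1987 power.
print(f"{power_multiple(2000, 5):.1f}x")       # -> 10.1x

# The actual purchase: $1,800 still buys roughly 9x the original power.
print(f"{power_multiple(1800, 5):.1f}x")       # -> 9.1x
```

Run as written, it shows why an $1,800 purchase sat near the B end of the continuum: I gave up only about one 1987-machine's worth of power to pocket $200.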

And that's the way it went. Every couple of years I'd buy a new machine. Each would be much more powerful but would cost about $200 less. Like most other computer shoppers, I tended to take most of the Moore dividend in added computing power. We did this because greater power added functionality, and we felt compelled to besides: our operating systems and software grew larger regardless of functionality, so we had to grow just to keep up.

On January 15th, The Economist ran an article on the rise of the cheap sector of the computer business. Netbooks, of course, are the most obvious sign of this, but there are other indicators that computer shoppers are taking their Moore dividend in cash rather than in computing power.

Even Microsoft has cottoned on: the next version of Windows is intended to do the same things as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that runs faster than its predecessor on the same hardware.

The Economist blames the recession for this, but my question is: is going small a bad thing? I've long had a fascination with the low end of the computer revolution. I think it's a good thing when a kid with just a few bucks can cross the digital divide. I'm looking forward to a future landmark: the day McDonald's offers a McLaptop for $10 with the purchase of a Happy Meal.

I bought a netbook this Christmas for my kids. It's no toy: it handles word processing, spreadsheets, email, browsing, iTunes, and simple games without difficulty. I'm sure I could use it to run the switchboard for the FastForward Radio show. For many people, perhaps most, it's all the computer they need. That's the upside: almost everyone in the developed world can now get their hands on a "good enough" computer.

If there's a downside, it's this: if there were a killer app that required more computing power, many people would find that they "need" more computer. The lack of a killer app is fueling the race to the bottom more than the recession is.

Case in point: Vista underperformed in the market, in part, because it offered the usual new-OS bloat without a significant increase in functionality. Now, since the next killer app has yet to arrive, the next version of Windows will shave the bloat. It's probably a smart move on Microsoft's part, but I'm wishing we had the killer app.

And what would that be? I'll wager that either AI-driven personal digital assistants or virtual worlds will come of age.

Comments

The main reason people will take that Moore dividend in cash is the physical limits of humans, not of the computer.

It is the law of diminishing returns.

First, speed. If the computer appears to respond instantly, the human regards that as fast enough. It does not matter whether the response came in 50 microseconds or 50 nanoseconds.

People have only two eyes. They will not buy ever-larger monitors or ever more monitors. There is no more point in doing so than in keeping several enormous reference books open at the same time.

A scholar might do so. Readers at home do not.

People can deal with multiple tasks, but there is a limit. Beyond that they become confused and ineffective.

They can use a mouse and keyboard at limited speed. The limit is not only tactile but also in deciding what to type or click.

Even speech recognition must be limited by the rate at which people can decide what to say.

For now, modest home computers can outperform the demands of most users. So there is little reason to buy more power.

Reduced size and cost and simplicity will sell.

Nonsense like the need to defrag, back up, and constantly upgrade security software needs to end.

And driver, graphics, and audio format issues should not plague the user. Wi-Fi and home networking should be simpler.

Establishing private networks with friends should be trivial.

None of these desirable things involve hardware limits.

A team at M.I.T. (youngsters rebelling against Nick Negroponte's $100 OLPC project) is cloning the Apple II.

http://www.macobserver.com/article/2008/08/05.11.shtml

The goal is that any kid in the world can leap forward into the late 20th century for about $12.

I'll try this again.

The human is now the limiting factor for personal computing. He can only see, read, decide, type, and click at a limited rate.

Once displays are totally clear and large enough, there is little incentive to upgrade them.

i.e., a 22-inch monitor is as useful as a 28-inch one for almost everyone.

Once response time is fast enough, a faster response means nothing.

i.e., a response in 10 milliseconds seems just as fast as one in 10 nanoseconds.

Specialists will use the best. Most people will buy cheap and adequate.

The things that are needed now are simpler software, less maintenance such as defragging, and better battery life.

And a better interface. Voice recognition that really works well, and someday perhaps mouse control by glance and/or thought.

But whatever that killer app is or will be, it will be on the net. Or on the web. In the cloud.

And so all we need is storage, networking, and an interface.

I use several spreadsheets on a regular basis. One of them is pretty deep. I imported it into Google Docs with no problem, and now I drive around it on the web.

