How web 2.0 will cost us money

I'm writing this on a 1GHz G4 Ti PowerBook with 1GB RAM. It's so old, I finished paying for it some time ago.

Until about a year ago I had no desire to upgrade: I don't edit video, and I rarely gimp huge images. It's travelled half-way around the world with me (and survived the plunge out of a 747 overhead locker with great aplomb). The machine compiles code of the size I can write by myself fast enough for me not to care, and it can even do that while playing back music. It can just about drive a 1680x1050 monitor (so long as nothing too visually exciting happens). But these days, browsing the web with this machine is increasingly painful as the 2.0 sites get more and more JavaScript intensive, and as that trend spreads to more and more sites. Try doing some general browsing with JS turned off and see how many plain old websites--not a social tag cloud in sight--just don't work at all without it. This is a sad state of affairs.

I might add that editing this blog post is slightly more painful than I'd like: Firefox's CPU usage is peaking at about 50%, which is ludicrous.

When I started my professional programming career I was well pleased to have a SPARCstation 5 as my desktop machine. Check out those numbers: 110 MHz! You'll still see those boxes occasionally today (they're very well built), hidden away in data centers running some admin daemon or other. For a while Sun sold headless SS5s as web servers, imagine that. At the time the thought of a "super-computer" grade laptop like the PowerBook I have here would have been laughable. And now it's a crying shame that all this capacity is being burned up in the name of this sort of thing (95% CPU), clever as it is. Which is why I salute Ted for this little investigation, and find this survey of the art rather dismaying for its implications.