Are We Alone – Do Computers Byte?

by Gary Niederhoff on January 4, 2011

The march of computer technology continues. But as silicon chips and search engines become faster and more productive – can the same be said for us?

The creator of Wolfram Alpha describes how his new “computational knowledge engine” is changing – and improving – how we process information. Meanwhile, suffering from data and distraction burnout? Find out how far some folks go to stop their search engines.

Also, the Singularity sensation of humans merging with machines… and why, for the ancient Greeks, all of this is “been there, done that.” A deep-sea dive turns up a 2,000-year-old computer!

Listen to individual segments here:
Part 1: Jo Marchant on the Antikythera Mechanism.
Part 2: Stephen Wolfram on Wolfram Alpha.
Part 3: Fred Stutzman on “Freedom”.
Part 4: Peggy Orenstein on distraction.
Part 5: Ray Kurzweil on the singularity.


Stephen, January 19, 2011 at 1:37 pm

I built a computer in 2002 and benchmarked it. I built another in 2010, spending a bit more money. The new system is only about 8 times faster. It has 4 CPU cores, and each core is only about twice as fast as the single core in the 2002 system. Worse, cores this fast were available in 2002. Computers are getting faster, but not as fast as they used to. There are a couple of reasons for this.
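A quick sanity check of that arithmetic, as a sketch: the 4-core and roughly-2x-per-core figures are the ones from this comment, while the 18-month doubling time is just the usual Moore's-law rule of thumb for comparison.

```python
# Back-of-the-envelope check of the 2002-vs-2010 comparison above.
# The 4 cores and ~2x-per-core figures are from the comment; the
# 18-month doubling time is the usual Moore's-law rule of thumb.

cores = 4               # cores in the 2010 build
per_core_speedup = 2.0  # each core ~2x the single 2002 core

best_case = cores * per_core_speedup
print(f"Best-case aggregate speedup: {best_case:.0f}x")  # 8x

years = 8               # 2002 -> 2010
doubling_years = 1.5    # doubling every 18 months
expected = 2 ** (years / doubling_years)
print(f"Old-style doubling would predict: {expected:.0f}x")  # ~40x
```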

Chips were big enough for the first computer-on-a-chip in the early 1970s. As chips got bigger, with more parts, there were obvious things the increased part counts could be used for: barrel shifters for arithmetic, and so on. Consolidating parts of the computer system onto a single chip was a big deal, because inter-chip communication is slower than communication within a chip. And as chips got bigger and bigger, more of these obvious tricks were used. The chips themselves improved, too: parts became smaller, and better materials, like copper wires instead of aluminum, led to improved speed. But in both cases, the easy tricks have already been tried. Now the larger part counts are being used to produce more CPU cores, and each core isn't much quicker. We've more or less reached the limit called the von Neumann bottleneck, the time it takes to fetch data from memory, and we're reaching quantum limits on how small the parts on a chip can be. So the exponential speed improvements are slowing. That suggests that Ray's singularity will have to wait.
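One standard way to quantify why extra cores don't simply multiply speed is Amdahl's law. The comment doesn't name it, but the sketch below, using a made-up 90%-parallel workload, makes the same point:

```python
# Amdahl's law: with a fraction p of the work parallelizable,
# n cores give an overall speedup of 1 / ((1 - p) + p / n).
# Illustrative only; the 90% figure is made up for the example.

def amdahl_speedup(p: float, n: int) -> float:
    """Overall speedup on n cores when fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

for n in (1, 2, 4, 8, 64):
    print(f"{n:2d} cores: {amdahl_speedup(0.9, n):5.2f}x")

# Even with infinitely many cores, the speedup is capped at 1/(1-p) = 10x,
# which is why stalled per-core speed (and the memory bottleneck) matters.
```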

But there’s another issue. Software. If no one is working on creating smarter software, then we’ll just have faster dumb computers.

Of course, it may be that we've had computers fast enough for self-awareness for ages. One estimate is that the human brain can store perhaps a terabit of information. My 2002 computer has a terabyte (8 times bigger) disk drive. It may be slow, but it should be able to at least remember everything that I can, and maybe more. It can certainly do arithmetic faster than I can (even with my abacus). Maybe Stephen Wolfram's work will lead to self-aware software.
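For what it's worth, the bit/byte arithmetic there checks out; a trivial sketch, using decimal prefixes:

```python
# Unit check for the brain-vs-disk comparison above (decimal prefixes).
brain_bits = 1e12        # "perhaps a terabit" -- the estimate quoted above
disk_bytes = 1e12        # a 1 TB drive
disk_bits = disk_bytes * 8

print(disk_bits / brain_bits)  # 8.0 -> the drive holds 8x the estimate
```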

I wish him luck, and welcome our new overlords.
