Tech shifts that shook up the status quo and left the experts behind

Sometimes you assume that a technology has reached its pinnacle. Usually you're wrong.


"I think there is a world market for maybe five computers," would in retrospect have been a pretty funny thing for IBM CEO Thomas Watson to have said 1943. He never said any such thing, as it happens, but the urban legend that he did is very persistent, in part because of what we know about how technology advances: we know that what seems reasonably stable and fixed today can be swept aside as inadequate tomorrow, and what seems impossible today will be commonplace in the future. This slideshow examines moments in tech history when the consensus that we had reached a high point or logical conclusion became suddenly, hilariously incorrect.

This slideshow originally appeared on ITworld.com.

Credit: Poor, misquoted Bill Gates. REUTERS/Denis Balibouse
More kilobytes, sure -- but do we really need more bits?

One of the most laughable-in-hindsight predictions from a tech luminary is Bill Gates' famous pronouncement, on the release of the IBM PC, that "640K ought to be enough for anybody." The only problem is that Gates swears he never said it, and likes to dig up well-sourced quotes from the same time period proving that he believed just the opposite. But a similarly short-sighted quote from him does seem to be genuine: "We will never make a 32-bit operating system," he declared in 1983 at the unveiling of the MSX, an 8-bit Z80-based reference machine Microsoft helped develop. The MSX flopped, and the 32-bit Windows NT 3.1 arrived a decade later.

"Home computer" can mean more than one thing

At the 1977 World Future Society convention, DEC founder Ken Olsen definitely said the seemingly boneheaded quote he's famous for: "There is no reason for any individual to have a computer in his home." But the context is crucial: he wasn't talking about PCs (which DEC was already selling), but rather about a central computer that would automate utilities, lights, and other aspects of the home, something futurists had been predicting as imminent since the 1950s. He was right then -- such systems seem futuristic but are generally pointless and clunky -- and still is today, almost 40 years later, although with Google's acquisition of Nest, that may be about to change.

It can be done -- but can we afford it?

Sometimes even visionaries can't see where their own inventions will end up. In this fascinating 1981 Christian Science Monitor article about the nascent cellular phone industry, Marty Cooper, the man who essentially invented the cell phone, declared that "Cellular phones will absolutely not replace local wire systems." His doubts were not about the technical capabilities but about cost: "Even if you project it beyond our lifetimes, it won't be cheap enough." Cooper is still going strong today at 85, and has lived long enough to see the approach of the tipping point at which wired telephony networks become legacy technology.

Even you can't have believed this

In 1933, Boeing could be justly proud of the brand-new Boeing 247, which featured a host of innovations such as all-metal construction and retractable landing gear. It was the first plane that could get passengers from one end of the United States to the other without needing to change planes, though refueling stops meant that the whole process took 18 hours. Still, for the Boeing chief engineer to say "They'll never build them any bigger" as the first model was wheeled out took a certain chutzpah. The airplane seated 10 passengers and one flight attendant. The DC-3, which entered service two years later, was bigger.

If you build it, they will watch

You'd think that someone in charge of a major entertainment conglomerate would know something about humanity's ability to consume mindless TV by the gallon, but in 1994 Viacom CEO Sumner Redstone wasn't convinced. "I will believe the 500-channel world only when I see it and when someone explains to me what's going to be on it," he said in a speech about emerging network technologies. Clearly Redstone knew it was technically possible, but doubted the economics of it. Twenty years later, we're here to tell him what's on the 500-channel set: reality TV. Lots and lots of reality TV.

When "good enough" suddenly isn't

In 2009, Wired published a long, glowing article about "good enough" technology -- tech that isn't necessarily the latest and greatest, but that does one job simply, cheaply, and well. The centerpiece of the article was the Flip Ultra, a cheap HD video camera that had taken the world by storm by being good enough. But only two years later, Cisco, which had spent almost $600 million on the company, shut it all down, as video enthusiasts were discovering that the smartphones they already owned were also good enough. (Flip loyalists struggled to see the logic in the decision.)

Credit: Not so bankrupt anymore, eh, Apple? REUTERS/Stephen Lam
The second coming of subscription music

In 2003, you could forgive Steve Jobs for being a little smug about the model Apple had come up with for the iTunes store, selling music song by song and file by file. "The subscription model of buying music is bankrupt," he declared to Rolling Stone, and given that he'd sold millions of songs in less than a year while subscription services numbered their customers in the thousands, he seemed right. But as wireless connectivity became more widespread, streaming services like Pandora, both paid and ad-supported, became more popular and profitable. Last year Apple threw in the towel and launched its own streaming service, iTunes Radio.

Credit: Hard to make a cute graphic out of "utility." REUTERS/Fabrizio Bensch
What's in a name? Sometimes, success

In 15 years in tech journalism, I watched many companies try to drum up interest in the idea of services running on centralized servers and accessed by thin (or thinner) clients. The idea went by a lot of names -- network computing, utility computing, grid computing, [fill in the initial here]aaS -- but the concept generally flopped, or at least was restricted to specific niches. But about five years ago, a new term arrived that covered many of the same concepts -- cloud computing, or just "the cloud." (Here's IBM developerWorks trying to explain the differences in 2009.) Whether we were just waiting for the right metaphor or for better network connectivity, it's undeniable that the concept has truly arrived.

The End of the World that never comes

And sometimes, it's not technology that gets outpaced, but its creators. Bob Metcalfe, inventor of Ethernet and founder of 3Com, was one of the pioneers who made the Internet possible -- but he doubted its ability to scale. In a 1995 InfoWorld column, he predicted a catastrophic network failure, which he called a "gigalapse," that would wreck the nascent Web. (The original column seems to have vanished, though he rehashed the arguments for Network World the next year.) But the Internet turned out to be stronger than one of its creators imagined, and Metcalfe dutifully ate his words -- in the form of a slurry he made by putting his print column in a blender -- at 1997's World Wide Web Conference.