Network World - Just over a year ago I wrote about something that was incredibly exciting: a commercial cold fusion power generation system. Assuming it worked as claimed, such a system would herald huge changes not only in the energy industry but in pretty much every aspect of the global economy.
In IT, practical cold fusion generators could, in theory, power entire data centers for next to no cost and provide power in locations far from the grid. And the existing power grid would itself potentially be made obsolete by fusion power generation. As for the impact on transportation ... imagine a car that could be driven from coast to coast several times, for a few cents, without refueling! That's the promise of cold fusion power.
Note that cold fusion is often termed "Low Energy Nuclear Reaction" (LENR) these days, but I'll stick with cold fusion for this article.
Before I update you on where we've got to in this story, let me first explain the background for those of you who might not be up to speed. Those of you who are "au fait" may care to skip ahead.
The generation of energy by fusion is based on the theory that if you can persuade the nuclei of atoms to "fuse" together such that a new, heavier nucleus is formed, you will generate energy ... a lot of energy. The reason for this output is that some proportion of the mass involved in fusion is converted to energy.
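To get a feel for just how much energy is at stake, here's a back-of-the-envelope sketch using Einstein's mass-energy equivalence, E = mc², the principle behind fusion's energy yield (the masses and comparison figures are purely illustrative):

```python
# Mass-energy equivalence: E = m * c^2
C = 299_792_458.0  # speed of light in m/s (exact, by definition)

def mass_to_energy_joules(mass_converted_kg: float) -> float:
    """Energy released when the given mass is fully converted to energy."""
    return mass_converted_kg * C ** 2

# Converting just one gram of mass yields roughly 9 x 10^13 joules --
# on the order of the energy released by a ~20-kiloton explosion.
energy = mass_to_energy_joules(0.001)
print(f"{energy:.3e} J")  # ~8.988e13 J
```

In a real fusion reaction only a small fraction of the reactants' mass is converted, but even that fraction dwarfs the per-kilogram yield of any chemical fuel.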
Now, there are two ways, at least in theory, to achieve fusion. The most publicized and the technique with vastly more research dollars attached to it is "hot" fusion. Hot fusion attempts to emulate the conditions found in stars and so involves temperatures and pressures that are simply mind boggling.
Hot fusion is the goal of projects such as the National Ignition Facility which, along with the likes of the Large Hadron Collider, are fine examples of "big science." The NIF has cost, so far, in excess of $3.54 billion (the LHC is even more spendy, with a price tag of "$9bn ... as of Jun 2010").
Hot fusion is, if you're a geek, sexy. It involves enormous machines the size of houses, enough power to run a large city, and swarms of lab-coated acolytes to prepare and run the equipment which, to date, has completely failed to achieve the goal of generating more power than is put into it (called going "over unity").
Cold fusion, on the other hand, is, theoretically, fusion that can occur at "normal" temperatures (i.e. room temperature ... although I guess where you set your thermostat makes that a little vague) and "normal" pressures.
The concept of cold fusion goes back to the 1920s, but the general public really only became aware of the idea in the late 1980s when two respected electrochemists, Martin Fleischmann of the University of Southampton and Stanley Pons of the University of Utah, announced they had detected "anomalous heat production" in a laboratory setup orders of magnitude simpler than the equipment used by today's hot fusion researchers.
Alas, the results of Fleischmann and Pons' experiment proved very difficult to replicate, and the entire cold fusion field fell into disrepute. Moreover, those who gave the concept of cold fusion any credence after the discrediting of Fleischmann and Pons were ridiculed and ostracized. Even being interested in cold fusion could potentially end an academic science career.