Friday, February 29, 2008

MELTDOWN - What went wrong with nuclear power?

From GQ Magazine.

What went wrong with nuclear power? How did the cleanest, cheapest solution to our oil dependence become the stuff of apocalyptic nightmares? Where does the myth end and the truth begin?

On the morning of March 28, 1979, just after 4 a.m., a pair of workers on the graveyard shift at the Three Mile Island nuclear plant noticed something odd: The filters for the system’s cooling water had spontaneously shut down.

This was troubling, but far from catastrophic. Although the flow of water is essential to a nuclear plant—it keeps the radioactive fuel from overheating—TMI had been designed to handle just this type of problem. In the event of a filter shutdown, a bypass valve was engineered to open automatically, sending water around the obstruction and directly into the plant.

Unfortunately, at TMI that morning, no sooner had the filters broken than the bypass valve broke, too.

As the supply of coolant came to a stop, the plant’s internal temperature began to soar. Luckily, TMI had been designed to handle this problem, too. Within minutes, a second emergency valve popped open, venting the excess heat and steam into the containment facility.

Unfortunately, ten seconds later, that valve broke, too—staying open long after the pressure was reduced, and allowing steam and water to continue flooding out of the plant.

Now TMI was in a double bind. On the one hand, the supply of fresh water had stopped; on the other, the plant’s existing water was quickly flowing out. It was only a matter of time before the plant would be completely drained, and the uranium inside, exposed to the open air, would begin to burn furiously, becoming impossible to control, a condition known to the industry as “meltdown,” in which, like a slow-motion nuclear bomb, the walls of the plant would incinerate and collapse, and a black cloud of radioactive poison would alight above the eastern seaboard.

There was only one glimmer of hope: the plant’s operators, who had decades of experience and, with quick action, could almost certainly slow the damage—either by pumping additional water into the plant from another source, or by closing the open valve.

Unfortunately, the operators did neither.

Inside the plant’s cavernous control room, they stood before a seven-foot-high instrument panel, crammed with hundreds of dials, gauges, and meters, many of which were erupting with alarm sirens and flashing lights, but the operators had no idea what any of it meant. As the minutes turned into hours, and the plant’s internal temperature soared above 1,000 degrees Fahrenheit, then 1,500 degrees, with water levels plummeting and the prospect of meltdown becoming more likely by the second, the confused operators did nothing to improve the situation. In fact, they made it worse. When one of the plant’s emergency systems began replacing water at a rate of nearly a thousand gallons per minute—which probably would have stemmed the crisis—operators turned the pumps off. Then, for reasons that are difficult to comprehend, they shut down the plant’s circulating pumps, preventing the cool water still inside the building from reaching the superheated core.

It would be another two hours before the morning shift arrived and stopped the leak.

By then, it was too late: The temperature inside the reactor had risen to more than 4,000 degrees Fahrenheit. The heat had caused the top half of the radioactive chamber to collapse on itself. The uranium fuel had melted into a lake of radioactive liquid. The zirconium skin around that fuel had evaporated into a broth of volatile gases, including a giant cloud of hydrogen that would soon explode into a fiery ball. And the nation, still asleep in the early-morning hours, was spiraling into the worst nuclear accident in its history, a cataclysm of fear and public mistrust that would consume the airwaves and charge the political dialogue for months; that would bring white-suited emergency-response crews to the site and would summon the president of the United States to visit in a pair of bright yellow protective booties, ordering a full investigation; that would take ten years and a billion dollars to clean up, would destroy the nation’s faith in atomic technology, and would bring the hope and promise of nuclear power—promoted since the 1950s as a clean and plentiful source of energy that would be “too cheap to meter”—to a grinding, glowing, terrified halt.


Today, the cooling towers at Three Mile Island still loom above the rolling Pennsylvania landscape like four giant spacecraft descending upon the American pastoral. At nearly 400 feet tall and made of enough concrete, it is said, to pave a four-lane highway to Baltimore, they dominate the view in almost every direction: As you head north along the Susquehanna River, they blot out the afternoon sun; going south toward the local airport, they seem to barricade the end of the road; even looking up to the sky, the thick black high-tension wires and streams of manufactured clouds that spew from the still-working portion of the plant turn the sky into an ugly industrial stew. Standing on the far shore of the river and looking over at TMI, one can easily understand why opponents of nuclear power so often distribute pictures of the plant, a stark visual shorthand for an industry run amok.

Up close, however, the feeling can be different. One morning last February, I stood on the banks of Three Mile Island with the plant’s longtime spokesman, Ralph DeSantis, gazing up at the towers from below. Despite their ominous appearance, they had an almost elegant function: Their only purpose, it turned out, was to create an immense chimney where warm water could fall through the air, cooling as it went. From the ground, we could actually see this happening. Peeking through a series of slanted baffles at the base of the towers, we could peer into the open chamber, where thousands of gallons of water cascaded down the sides. The sound was deafening, like a massive waterfall.

“Overwhelming, isn’t it?” DeSantis shouted over the noise. “Believe it or not, that structure holding the baffles is made of redwood. When the water hits each baffle on the way down, it gets broken up into beads, which helps it cool.” He pointed to a building not far away. “And that’s where the circulating pumps are. The pipes are six feet in diameter! It’s a plumber’s nightmare.”

Like many of the nation’s nuclear workers, DeSantis was a local boy. He had left for college in the 1970s but soon found himself back home and bored, struggling to impart the nuances of civics to schoolkids who could barely read. By early 1979, he had quit teaching for a security job at TMI, but just a few weeks later, when the plant melted down and news crews came swarming, DeSantis jumped at the chance to be part of something bigger. He offered to help with public relations, and within days he was ushering public officials and news celebrities around the site, brushing shoulders with people like Andrea Mitchell and Diane Sawyer. In the thirty years since, his profile at the nation’s most notorious power plant has led to countless job offers in the PR world, but DeSantis has been happy to stay, becoming a kind of institution at TMI, a model of loyalty in an environment with no greater currency. As we walked from the cooling towers toward the reactor and then throughout the TMI complex, he had the air of a jolly small-town mayor, calling out to nearly everyone we passed and stopping to chat with more than a few.

Even so, as we crossed the plant’s expansive grounds, it was difficult to ignore the presence of danger around us—or anyway, the heightened and elaborate defenses against it. Since September 11, the nation’s 104 nuclear plants have become an object of much attention from homeland-security experts and terrorists alike, and security has skyrocketed, at a cost of more than $1.5 billion so far. TMI was no exception. Where only a few years ago employees could practically drive to the front door and park for the day, the parking lot had been Hail Mary’d several hundred yards away, and the long walk between had been repopulated with a series of three-foot-thick concrete walls—reinforced by two-inch-diameter rods of steel—and surrounded by skeins of razor wire, bulletproof sniper towers, and militarized sentries armed with machine guns. “It’s almost like a canal system of locks and dams,” DeSantis explained as we stepped into what looked like an oversize dog kennel. “When one gate is open, the other is closed. While the person is detained, we can do inspections and explosives detection, and if there’s any problem, there’s a Patriot Gate that rises out of the ground on both sides, which is just as strong as the concrete.” Even the concrete has been tested for impact: A video that circulates among TMI employees shows a pickup truck plowing into the plant’s outermost barrier and collapsing like an accordion. The barrier remains unblemished.

When we reached the entrance to the plant’s guard station, DeSantis and I surrendered our driver’s licenses in exchange for radiation monitors to hang around our necks, crossed another gauntlet of security locks enclosed in bulletproof glass, stepped into a machine called a Puffer, which blew air across our bodies and sniffed it for explosives residue, and then passed through a threshold of metal detectors before emerging into a courtyard filled with yet more razor wire and paramilitary guards, where, on the far side, we finally reached the door to the plant itself.

The inside was like nowhere else in the world. It is tempting to say that if you were to wake up inside, without ever having seen a power plant, you would know instantly where you were. Pipes the diameter of a Volkswagen bus and painted in glossy primary colors stretched along the walls and the ceiling, springing into the room at ninety-degree elbows before shooting upward to the floor above or down to the one below. Hoses the size of anacondas coiled their way around corners and over door headers, and stop valves that looked like nautical steering wheels were strapped to the walls with tags to identify them. Everything was polished and reflective under bright lights, and the air seemed to shiver from the pipes’ vibrations. It was like being trapped inside a giant air conditioner.

We climbed an open metal staircase that stretched between the pipes and machinery and followed catwalks to look around. Virtually everything around us related to water: tanks so enormous that the curve of the cylinder was nearly imperceptible, filters capable of purifying thousands of gallons at once. If the Hollywood depiction of a nuclear plant involves zones of exposure and pervasive risk, where workers live in fear of radiation—think The China Syndrome or Silkwood—life inside a real power plant was startling proof of what actually drives a nuclear plant: water. Except for the presence of uranium in a single room, the rest of a nuclear compound is essentially a giant steam engine, with three circuits of water doing virtually all the work.

The first of these water circuits, known as the primary, is the only one that actually touches radioactive fuel. In the earliest stage of the process, this water is channeled through a series of tall, thin zirconium tubes packed with uranium, known as fuel rods, which are extremely hot from the fission reaction inside. It takes only a few seconds for the heat from the uranium to make the water hot, too. Once the water reaches 212 degrees, it must be pressurized to avoid boiling; at 600 degrees, under 2,000 pounds of pressure per square inch, it is ready to be pumped into the next stage of the process, known as the steam generator.

Here, the superheated primary water is brought into contact with a secondary circuit of cooler water. Since the pipes of the two circuits are allowed to touch but the water inside them is not, only the heat can be transferred, and none of the radioactivity. As the secondary circuit absorbs this heat, it boils into steam, which is piped into a series of giant fan blades known as turbines. The force of the steam blowing through those fans causes the blades to spin. If Three Mile Island were a steamboat, this would be the final result—the spinning fans would rotate a paddle wheel and push the boat across the water. Instead, at a power plant, the spinning motion is used to rotate a giant coil of wire inside a magnetic field, creating a current of loose electrons.

Presto: nuclear power.

All that remains is to cool the water and start over. For this, a third circuit of water is pumped into the plant directly from the Susquehanna River, absorbing heat and then flowing back outside to the cooling towers and into the river below. Since, at least in theory, this water never comes into contact with anything inside the plant except its own pipes, the warm water returning to the river should be no more or less polluted than when it was first pumped out. Likewise, the evaporative clouds billowing from the tops of the cooling towers, which appear so grimy in photographs, are actually no different from the clouds forming naturally above the river. In fact, those billowing clouds, which even some nuclear workers casually refer to as “smoke” or “steam,” are actually neither. Like a man’s breath on a cold day, they’re mostly water vapor and tend to fade or even disappear in warm weather. True steam, by comparison, is the more sophisticated substance—entirely gaseous and devoid of humidity—that powers a plant’s turbines, and this almost never stops blowing, especially at TMI. Over the past ten years, the plant has become famous for its constancy, setting records for continuous operation. The latest, among more than 250 similar reactors worldwide, was 689 days without pause or fail.

What all this amounts to, in a typical year, is about 7.2 million megawatt hours of electricity, or enough to satisfy the needs of 800,000 homes. By way of comparison, to produce the same amount of electricity, a coal-fired power plant would have to incinerate more than 3 million metric tons of fuel, producing 500 pounds of carbon dioxide per second, as well as 1,200 pounds of ash per minute and 750 pounds of sulfur dioxide every five minutes. Looking at the cooling towers with that in mind, where a smokestack would be at any of the nation’s 600 coal plants, it is easy to appreciate the lure of nuclear power: The carbon footprint of a nuclear plant is precisely…nothing.
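The coal-plant rates above annualize into striking totals. As a rough check, here is a back-of-the-envelope sketch in Python; the per-second and per-minute rates are the figures quoted in the paragraph above, while the unit conversions and yearly totals are my own arithmetic:

```python
# Annualizing the article's coal-plant emission rates.
# All rates are from the text; the yearly totals are derived.

LB_PER_METRIC_TON = 2204.62
SECONDS_PER_YEAR = 365 * 24 * 3600
MINUTES_PER_YEAR = SECONDS_PER_YEAR / 60

co2_lb_per_sec = 500       # "500 pounds of carbon dioxide per second"
ash_lb_per_min = 1200      # "1,200 pounds of ash per minute"
so2_lb_per_5min = 750      # "750 pounds of sulfur dioxide every five minutes"

co2_tons = co2_lb_per_sec * SECONDS_PER_YEAR / LB_PER_METRIC_TON
ash_tons = ash_lb_per_min * MINUTES_PER_YEAR / LB_PER_METRIC_TON
so2_tons = (so2_lb_per_5min / 5) * MINUTES_PER_YEAR / LB_PER_METRIC_TON

print(f"CO2: ~{co2_tons / 1e6:.1f} million metric tons per year")
print(f"Ash: ~{ash_tons:,.0f} metric tons per year")
print(f"SO2: ~{so2_tons:,.0f} metric tons per year")
```

By this reckoning, a single coal plant of that size emits on the order of seven million metric tons of carbon dioxide a year, which makes the article's zero-carbon comparison concrete.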

After a tour through the rest of the plant—including the control room, where operations are monitored twenty-four hours a day, and the turbine chamber, where the eighteen-foot turbines spinning at 2,000 rpm generate a dizzy, strangely exhilarating high—DeSantis took me outside for a peek at the rest of the island.

Although the TMI plant occupies less than half of Three Mile Island itself, for security reasons the entire landmass is kept under armed guard, creating a de facto wildlife preserve around the facility of about 400 acres. As we entered this small wilderness, the transition from extreme industry to extreme nature was instantaneous. If the popular image of Three Mile Island has become a Guernica of toxic devastation, the island itself was a bucolic rebuttal: Redbud trees and sycamores abounded among stands of native switchgrass, and plumes of second-year mullein stood as stiff and straight as uranium fuel rods. The sound of the river was punctuated only by the songs of birds. “We had osprey nesting here two years ago,” DeSantis said, “and there’s a pair of peregrine falcons nesting by the reactor building right now—I understand it’s one of only ten nesting pairs in Pennsylvania. It’s cool when you see them teaching their babies to fly. We also get Canadian geese and foxes and deer, and some of the local ornithologists think the bald eagles are going to nest soon, too—they’re already around.”

To DeSantis, there was nothing improbable about any of this, finding a wildlife sanctuary on the site of the nation’s worst nuclear meltdown. The accident, for him, was a long-distant nightmare, an aberration, dwarfed and marginalized by three decades of clean, safe power without a single accident. “I wish I could bring more people here to see for themselves,” he said. “Sometimes we bring reporters, but they only want to talk about the accident. They only care what happened thirty years ago.”

DeSantis looked mystified.


To understand why TMI still resonates so powerfully in American life, and our public policy, it’s helpful to separate the accident’s impact into two distinct categories: the radiological fallout, and the psychological.

How much fallout was there?

From a radiological standpoint, the impact is somewhat easier to measure. Radiation is counted in units called millirems. Because the earth is warmed by the largest nuclear reactor of all—the sun—virtually all of us are exposed to a certain baseline of millirems each year, depending on where we live. At higher elevations, like Denver’s, the atmosphere overhead is thinner and blocks less cosmic radiation, so citizens receive about 180 millirems per year; at lower elevations, like Delaware’s, residents receive only 20 or 30. Building materials can also make a difference: Because of the presence of elements like radon in many rocks, people who live in brick or stone houses tend to receive 50 or 100 millirems more each year than people who live in wooden houses. Region also has an impact. In areas rich with coal, residents absorb about 100 millirems annually from the ground; in northeastern Washington State, residents get about 1,500 millirems from local minerals; in certain parts of India, as much as 3,000 millirems per year may come from the ground. A cigarette smoker gets about 1,300 millirems per year, mostly from the presence of the radioactive isotope polonium-210, which is found in tobacco (and, recently, in the autopsy reports of Russian spies). Back home in Washington, D.C., Dick Cheney gets about 100 millirems every year from his pacemaker. Every time you go in for a dental X-ray, you get 5. Chest X-ray: 15. PET scan: 650. In fact, just by being alive, you generate a little radiation of your own, and most people absorb about 40 millirems each year from themselves.

At Three Mile Island, according to a 1980 inquiry by the Nuclear Regulatory Commission, the maximum level of radiation that anybody within a fifty-mile radius could have received from the accident was about 100 millirems—the equivalent of moving to Colorado for a year, or into a brick house for two. According to another study, by the Pennsylvania Departments of Health and Environmental Resources, among 721 locals tested, not a single one showed radiation exposure above normal. A similar study by the state’s Department of Agriculture found no significant trace of radiation in the local fish, water, or dairy products, which tend to register minute impurities. And a study released in 2000 by the Graduate School of Public Health at the University of Pittsburgh found that, twenty-one years after the accident, there was still no evidence of “any measurable impact” on public health.

Given the extreme scale of the meltdown at TMI—including an explosion of hydrogen, the liquefaction of radioactive uranium, and the release of a plume of radioactive gas into the air outside—it is reasonable to conclude that the lesson of Three Mile Island is not merely a matter of what went wrong at the plant but also an example of what went right. For so many people and so many systems to fail so spectacularly all at once, without any measurable effect on public health, may be the last, best proof that a system is working.

Still, for nearly thirty years, the psychological fallout from TMI has metastasized into something much more difficult to measure or explain. In the aftermath of the meltdown, the share of Americans who support nuclear-power plants has dropped from a high of 70 percent before the accident to around 40 percent, and today one in ten people would like to eliminate the nation’s fleet of nuclear plants entirely. What drives this opposition, in many cases, is the conflation of magnitude with probability. That is, when people worry about nuclear power, what they worry about is the scale of an accident, not the likelihood. In this regard, nuclear power is just the opposite of the nation’s coal-fired plants, where harm to the environment is both ruinous and certain but comfortingly slow. It may take decades or even centuries for the effects of particle soot, acid rain, and global warming to claim a million lives. By contrast, the nightmare scenario with nuclear power is decades of cheap, plentiful, pollution-free energy—followed by a sudden meltdown that wipes out a city. For most people, the reality that coal-based pollutants like mercury and sulfur dioxide are killing us every day—taking as many as 24,000 lives per year, according to nonpartisan researchers (that’s a Chernobyl disaster every eleven hours), while nuclear plants have never claimed an American life—is beside the point. The image of a city disappearing in a nuclear haze, however improbable it may be, trumps everything else. Many people, according to polls, not only oppose building new nuclear plants; they oppose the ones we already have. Unfortunately, since nuclear energy currently makes up about 20 percent of the nation’s electrical supply, in order to eliminate it, the rest of the nation’s power suppliers would have to increase their own production by 25 percent of existing levels.
Since that’s not possible for most current renewables—like wind, solar, and hydroelectric farms, which are already maxed out—the real cost of eliminating today’s nuclear-power supply would be an immediate 30 percent increase in the nation’s coal, gas, and oil plants. That’s 30 percent more sulfur dioxide, mercury, and nitrogen oxide in the air than we’re emitting today. Also, since those plants make up nearly 40 percent of the nation’s total carbon dioxide output, that means an instantaneous, and permanent, 12 percent rise in carbon emissions. If only the GNP did so well.
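The arithmetic in this passage can be traced step by step. In the Python sketch below, the 20 percent nuclear share and the 40 percent share of national carbon dioxide output are figures from the text; the roughly two-thirds fossil share of the grid is my assumption, inferred because it is what makes the article's 30 percent figure come out:

```python
# Tracing the article's replacement arithmetic.
nuclear_share = 0.20      # nuclear's share of U.S. electricity (from the text)
fossil_share = 0.667      # assumed fossil share of the grid (my inference)
fossil_co2_share = 0.40   # fossil plants' share of total U.S. CO2 (from the text)

# If nuclear disappears, the remaining 80% of suppliers must cover everything,
# so their output must rise by 20/80 = 25%.
other_increase = nuclear_share / (1 - nuclear_share)

# If fossil plants absorb the entire shortfall, their output rises by
# 20 / 66.7 = roughly 30%.
fossil_increase = nuclear_share / fossil_share

# A 30% rise in fossil output means a 30% rise in fossil CO2, and fossil CO2
# is 40% of the national total: 0.30 x 0.40 = a 12% rise overall.
co2_rise = fossil_increase * fossil_co2_share

print(f"other suppliers: +{other_increase:.0%}")
print(f"fossil plants:   +{fossil_increase:.0%}")
print(f"national CO2:    +{co2_rise:.0%}")
```

The 12 percent figure, in other words, is not independent of the 30 percent figure; it is the same shortfall viewed through fossil power's share of national emissions.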

Since the accident at TMI, even this dubious trade-off has been widely promoted by nuclear opponents, and not only on our side of the Atlantic. In Europe—especially after Chernobyl—one could almost hear the brakes squealing in the nuclear industry. Within months, the Italian government had begun to phase out nuclear energy; Germany, which has historically produced about 30 percent of its electricity from nuclear plants, began a two-decade melee over the industry, which resulted in a 2000 decision to eliminate it entirely; and Switzerland, Spain, Holland, and Austria have all followed a similar path—even while fast-growing Asian countries like China and India have moved in exactly the opposite direction, rushing to build nuclear plants, with nearly a dozen currently under construction. By 2020, China hopes to quadruple its nuclear-power output.

This has yielded strange ironies between the nations running from nuclear power and those running to it. Without nuclear plants, for example, Germany will no longer be able to meet its own energy needs, and it plans to import a growing chunk each year from France, which remains one of the last European nations to embrace nuclear power and currently produces nearly 80 percent of its total electricity that way. (Providing France, incidentally, with the cleanest air in Europe, the lowest electrical costs, and the greatest percentage of electrical independence, not to mention $4 billion a year in electrical exports.) That means for every ten megawatts of power that Germany plans to purchase from France in the coming years, eight megawatts will be coming from the very technology that requires Germany to import power in the first place.

A similar case exists in the American West. For political reasons, the state of Nevada (home of the dirtiest coal plant in the country, according to a recent report by the Environmental Integrity Project) opposes almost everything with the word nuclear in it. But with the rapid growth of Las Vegas in recent years, the state is unable to generate its own power and currently imports as much as 15 percent of its electricity from California and Arizona. Of course, since those states produce 14 percent and 23 percent of their power at nuclear plants, respectively, that means Nevada, which likes to proclaim itself “nuclear-free,” actually gets a considerable amount of its power from nuclear plants, too—but at marked-up prices that profit California and Arizona.

At the federal level, U.S. policy has never mirrored the European retreat from nuclear power, and the Bush administration has actually been as enthusiastic about nuclear energy as it is about every other form of energy. Even so, over the past thirty years, the U.S. government has not issued a single license to build a new nuclear plant. There are two reasons for this. One is that public opposition has made the development process prohibitively expensive for the industry. In the aftermath of TMI, a number of legal challenges were initiated against plants under construction, and many companies simply abandoned their unfinished projects. Within five years of TMI, some fifty reactors had been scrapped, many of them after billions of dollars’ investment.

But an even bigger challenge to the industry’s development may be the one at the other end of the nuclear process: not the cost of building new plants but the cost of storing old waste.

Since the development of nuclear power in the 1950s, the storage of waste has always been a priority of the federal government. Nuclear waste, after all, is even more dangerous than nuclear fuel, so politicians on both sides of the aisle have long agreed that it is in the national interest to store it safely. And for the past thirty years, Congress has worked to do so, researching the options for a national repository and then, over the past fifteen years, building a multibillion-dollar facility deep in the Nevada desert, more than a thousand feet underground, at a site called Yucca Mountain.

But so far, not a gram of waste has ever been sent there. That’s because the people of Nevada, by a large majority, believe the repository should be in somebody else’s backyard and have thrown up legal challenges at every stage of the site’s development. With the ascension of Nevada’s Harry Reid to the position of Senate Majority Leader in 2006, the state’s quest to block Yucca Mountain seems more likely than ever to succeed. Although development continues, progress is glacial, and Congress has been, to put it mildly, slow to grant approvals. As recently as January of this year, congressional budget cuts forced the repository to fire nearly all its on-site employees, scaling back to a mostly administrative operation. When I asked Senator Reid what would happen to the repository in the coming years, he minced no words. “It will never happen,” he said flatly.

What this means for the nation’s 132 million pounds of radioactive waste is not hard to predict. In politics, as in most things, there is no such thing as a nondecision, and the failure to open Yucca Mountain is an implicit decision to keep the waste where it sits today: dispersed among 121 sites in thirty-nine states, including the nation’s 104 nuclear-power plants—all in temporary facilities. Unfortunately, because each of those plants must be located near a large supply of coolant water, that means fifty years of radioactive waste—growing by about 2,000 metric tons every year—is currently being housed in places like the Indian Point Energy Center, less than twenty-five miles upstream from New York City on the banks of the Hudson River. The irony is that almost nobody on either side of the nuclear debate, not even some of Yucca Mountain’s most ardent critics, actually believes that storing waste at the power plants—dispersed, exposed, in watersheds that feed urban areas—is anywhere near as safe as Yucca Mountain for the long term. Yet there it remains, while Yucca Mountain, deep in the Amargosa Desert, sits empty.

In photos, the twenty-five-mile ridgeline of Yucca Mountain appears vast and sprawling, but in person the mountain is an unimpressive sight, more like an oversize anthill than any of the real peaks that surround it. Slumped along the California border about a hundred miles northwest of Las Vegas, the 1,200-foot “mountain” is officially part of the mysterious Nevada Test Site, which abuts the even more mysterious Area 51, but technically the repository is managed not by the military but by scientists at the U.S. Department of Energy. After $10 billion in development costs and thirty years of observation, it is safe to say that Yucca Mountain has become the most expensive, examined, and—so far, anyway—useless hunk of rock on earth.

To get there, I flew into Las Vegas and joined a team of DOE scientists for the two-hour drive across the desert, led by the former chief scientist for Yucca Mountain, a stout man in his sixties named Michael Voegele, with a jutting jaw and floppy chocolate bangs that fell into his eyes. Voegele was a repository in his own right—full of a stupendous range of arcane minutiae about the project, some of it fascinating, some of it otherwise, and most of which he managed to pack into a rhapsodic soliloquy as we made the drive. “In the early 1930s…,” he began, and for the next two hours, while the other DOE scientists and I listened in some combination of thrall and disbelief, he held forth on the nonpublic history of the Nevada Test Site, the chemical intricacies of uranium fission, the legal nuances of the 1982 Waste Policy Act, and the latest concepts in plate-tectonic theory, pausing only briefly to glance out the window and mutter things like, “Now, if you look out there for a minute, you can see Creech Air Force Base, where they steer the Unmanned Aerial Vehicles flying over Iraq and Afghanistan.…”

Voegele had been a principal at Yucca since its inception and was as enthusiastic about the project’s merits as he was encyclopedic on its history. In the late 1970s, when power plants had begun pressuring the federal government to retrieve the waste from their temporary facilities and store it, Congress assigned nine groups to study the options, and Voegele was on the team to research Yucca. With a doctorate in geological engineering and a background in chemistry, he soon came to believe that Yucca Mountain possessed a variety of chemical and physical properties that made it an ideal site for waste disposal. The mountain was nearly half a mile above the water table and more than a hundred miles from any populous area. The layers of rock in the mountain formed a natural shield from most rainfall, Voegele believed, and the composition of that rock created a natural filtration system for the small amount of water that did creep in. Since water is thought to be the most likely way for radiation to escape a repository, these properties would prove essential to Yucca Mountain’s viability.

By the end of 1987, Voegele’s proposal for the mountain had made it through two rounds of cuts by the Department of Energy—from nine options to five, and then down to three—when a lame-duck session of Congress brought the search to an abrupt end, passing a bill titled the Omnibus Budget Reconciliation Act of 1987 but known locally as the “Screw Nevada bill,” which just happened to say in fine print that the other two sites would no longer be considered.

“It was obviously political,” Voegele conceded as we barreled across the desert. “The other two sites were Texas and Washington, and they had a lot more horsepower than Nevada. The story goes that the Speaker of the House, Jim Wright, who was from Texas, walked into the office of the Majority Leader, Tom Foley, who was from Washington, and they just looked at each other, and Foley said, ‘It’s Nevada, isn’t it?’ ” Voegele laughed and shrugged. “But coincidentally, I think DOE’s technical information was leaning toward Yucca Mountain anyway. The site in Washington was below one of the largest aquifers in the Northwest, which drains into the Columbia River, and the Texas site was right below the Ogallala Aquifer, which is the largest aquifer in the United States.”

As we pulled up to the Air Force checkpoint, Voegele’s voice began to trail off, and we passed through the gate into the Nevada Test Site, which looked almost identical to the desert we had been crossing outside—brown, dry, littered with creosote and blackbrush—except for the presence of massive experimental structures scattered about, like the 1,500-foot-tall BREN Tower standing alone in the distance, measuring the effects of radiation from precisely the altitude of the Hiroshima bombing.

Several miles in, we reached the base of Yucca and steered the truck up a long ridgeline until we crested to a 360-degree view of the desert. We stepped out, and Voegele threw his arms wide in the shadowless midday sun. “This piece of rock is twelve and a half million years old,” he said to no one in particular. In the distance, we could see the tallest point in the contiguous United States, Mount Whitney to the northwest, and to the southwest, the Funeral Mountains perched above the nation’s lowest point, Death Valley. Scattered in between, on the desert floor, volcanic cinder cones sat like rounded black pyramids.

“The tunnels are 1,200 feet below where you’re standing right now,” Voegele said. “The water table is a thousand feet below that. Everything else is layered rock.”

Exactly how those layers of rock might or might not protect the repository from rainfall is what geologists at the DOE and the Nevada government spend most of their time debating. What is clear is that the rock consists of four principal layers, which alternate between a hard and relatively brittle material known as welded tuff and a sponge-like material known as nonwelded tuff. Proponents of Yucca, like Voegele, find in these layers an almost divine confluence of desirable traits. On the surface, where we stood, the harder material cloaks the mountain, shedding most rainfall down the sides and into the surrounding plains. The water that does seep into the cracks, advocates say, would have to travel 300 feet through fissures in a layer called Tiva Canyon, saturate a hundred feet of the softer rock, and then continue through fissures in another several hundred feet of hard rock known as Topopah Spring in order to reach the repository. Even then, to be dangerous, that water would first have to penetrate the metal canisters that are molded around the waste, become irradiated, continue down through several hundred more feet of hard rock, and fill yet another layer of spongy rock known as Calico Hills, before finally reaching the water table, where it might, depending on whose data you believe, either surface a century later in the middle of Death Valley, or else not at all. Also, since the spongy layers of Yucca Mountain happen to be rich in minerals known as zeolites, which are known to neutralize radioactivity, the mountain’s advocates tend to speak with an eerie certainty (some might say religious fervor) about the intelligent design at Yucca Mountain. “All the stars aligned for Yucca Mountain,” Voegele said. “Here’s what it comes down to: Even if the water gets through all the physical barriers, the zeolites would absorb 99.9995 percent of the radionuclides. We know that. The other .0005 percent is what the debate is about.”

Opponents of the repository, of course, tell a different story. For one thing, they don’t believe that the mountain will repel water. But even more important, they say, Yucca Mountain sits on top of a seismic fault. Even if the mountain were some rain-defying wonder of the world, a minor earthquake could change that very quickly. As we stood on the mountain’s summit, it was easy to appreciate this point of view. In addition to the cinder cones peppering the valley floor, a testimony to previous volcanic activity, the geology directly under our feet—Yucca Mountain itself—was actually made of compressed volcanic ash. That’s what welded and nonwelded tuff are. When I asked the head of Nevada’s Agency for Nuclear Projects, Bob Loux, about this somewhat unsettling fact, he was quiet for a moment, searching for diplomatic language. Instead, he said this: “We have zero trust in the DOE. We believe they’re entirely incompetent. Most people might be surprised that they would claim there is no risk of volcanoes and earthquakes on a seismic fault. But in Nevada, we’re used to hearing things like that. They used to tell us that there were no risks from nuclear-weapons testing. Well, of course there were. A lot of people got sick and died. My dad was a career DOE employee at the Nevada Test Site, and I can tell you that the culture of the DOE is all about downplaying risks. They are flagrant about it. When my dad was working for them, it caused a lot of problems between us. My mom used to have to step in and say, ‘Knock this shit off.’ But after my dad retired, he came up to me and said, ‘You know what? Everything you’re saying about these guys is right.’ ”

In the absence of agreement between the state and the DOE, Congress has asked the Environmental Protection Agency to arbitrate, and in 2001 the EPA released a set of standards for Yucca Mountain: Until otherwise directed, DOE scientists must assume that the state’s concerns are valid and still be able to prove that an earthquake or a volcano under Yucca Mountain would not release more than a certain level of radiation into the atmosphere. That level, according to the EPA guidelines, must be a maximum of 15 millirems per year for the first 10,000 years, and for the period between 10,000 years and a million years, the standard rises to 350 millirems per year. To put it another way, for twice the duration of recorded history and five times as long as the Christian era, a person living on top of Yucca Mountain during an earthquake can only receive as much radiation as a single chest X-ray, and a person living there a million years in the future—that is, in the year 1002008—can only get about half of what the average Seattleite gets today.
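The arithmetic behind those comparisons is easy to check. The sketch below uses the EPA limits quoted above, plus two assumed reference doses that are not from the guidelines: roughly 10 millirems for a chest X-ray, and about 700 millirems per year for the average Seattleite (the figure implied by the article’s “about half”).

```python
# Back-of-the-envelope check of the EPA dose limits described above.
# The two reference doses below are assumptions, not EPA figures.
CHEST_XRAY_MREM = 10        # assumed dose from a single chest X-ray
SEATTLE_ANNUAL_MREM = 700   # assumed annual dose for an average Seattleite

limit_first_10k_years = 15  # mrem/yr, EPA limit for the first 10,000 years
limit_to_one_million = 350  # mrem/yr, EPA limit out to one million years

print(limit_first_10k_years / CHEST_XRAY_MREM)    # about one chest X-ray's worth
print(limit_to_one_million / SEATTLE_ANNUAL_MREM) # about half a Seattleite's yearly dose
```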

If that seems impossibly stringent, Voegele noted, it’s a sign that you don’t work for the state of Nevada. As we loaded back into the truck and headed down to explore the tunnels, he pointed out that the state’s advocates, like Loux, still believe the standard is too easy.

“They used to say, ‘How can you assume a honking big earthquake won’t hurt anything?’ ” he said, shaking his head in disbelief. “Well, we don’t. We can’t! We have to assume a 6 on the Richter scale, and we’ve been able to demonstrate that it’s safe. So now they’re saying, ‘Well, it shouldn’t be 350 millirems, it should be 15 millirems for the whole million years.’ ” He snorted. “When you start asking for proof of what’s going to happen in a million years, you’re not a serious scientist.”

We rounded a bend into the parking lot for the tunnels, and the scale of the mountain seemed to magnify suddenly. From the summit, Yucca had been dwarfed in the shadow of Mount Whitney and the vast emptiness of the Amargosa Desert, but here at the portal it became cathedralesque, its mouth yawning thirty feet high against a sheer gray cliff, with massive ventilation hoses spilling from the top and a set of railroad tracks plunging into the darkness. As we walked down an elevated gangplank into the cavern, a fifteen-mile-per-hour headwind blew past us, the work of colossal ventilating machines sucking radon gas from the ground. The tunnel stretched endlessly into the distance, a perfect cylindrical void created by a twenty-five-foot-diameter drill known as a Tunnel Boring Machine, which the DOE engineers had literally hitched up and driven five miles through the mountain. Along the sides of the tunnel, circular steel ribs braced the walls every ten feet, giving the passage a skeletal feel, like the inside of a paper dragon.

“This is Alcove 5,” Voegele said as we turned off the main hallway into a shallow cavity. “When waste is brought into the repository, it’s going to be hot. So we conducted tests in here to see how the heat would affect the rock. We raised the temperature above 400 degrees—much higher than the boiling point of water—and kept it there for four years.” In other regions of the mountain, scientists took samples of the rock and exposed them to massive radiation, pressure, fractures, chemical baths, and sonic explosions, meeting the EPA standard—in simulations the agency found sufficient to certify—every time. “You are not going to find another site which can withstand this kind of scrutiny,” Voegele said.

Not all the mountain’s opponents dispute these results on scientific grounds; just as often, their objection rests on a simpler and more resolute fact: No matter what the experiments show, nothing can guarantee how real waste might behave inside the mountain.

This, of course, is undeniable. Yet it is also a standard by which any repository would be ruled out, since experiments and simulations are the only way to evaluate any site. But most of Yucca Mountain’s prominent critics, like Reid and Loux, insist they do support a repository…as long as it’s somewhere else. All of which raises the question: Where? In any location, the storage of radioactive waste must entail at least a small measure of risk, and the search for a solution without that risk—a perfect option—can obscure the options that are merely good, or good enough. The perfect can be the enemy of the good.

As we drove back toward Las Vegas, Voegele was mostly silent, but when we crested the final hill above the city, he spoke up. “There is an ethical dilemma at Yucca Mountain,” he admitted. “When Jimmy Carter was president, he said that our generation created this waste and we shouldn’t push it to future generations. That’s a very noble thing to say. But the fact is, we have to be careful how we interpret that. We have taken that to mean that Yucca Mountain has to last forever—we can’t expect future generations to fix anything or improve anything, ever. Well, that’s the wrong way to look at it. We should do the best we can right now, but no matter what we do, future generations will be able to change things at Yucca Mountain, they will have more knowledge and experience than we do, and they will probably want to change the system we create. They can do any number of things. They can move the material somewhere else; they can store it in a different way; they can change its chemical composition or reduce the radioactivity with methods we don’t know about. But right now, we don’t have any better options. We can’t leave this waste at the power plants forever. And we’re not going to find another repository without running into the same problems we have now. The bottom line is, Yucca Mountain is the best option we have. If we don’t use it, I don’t know what we’re going to do.”


One of the things we’re not doing is recycling. In other countries, this is known as “reprocessing” fuel, and it is neither complicated nor particularly expensive. It has, however, been illegal in this country for most of the past three decades.

The science of reprocessing is fairly simple. When uranium fuel is removed from a power plant, it is classified as waste, but there is still a tremendous amount of nuclear material inside. The reason the fuel must be removed from the plant is not because it has spent all of its potential but because, along the way, it has generated a variety of dangerous by-products, like cesium and strontium, which are far more radioactive than uranium itself. After a while, these elements would become so prevalent, and so volatile, that it would not be safe to keep them inside the plant, so the fuel rod must be removed.

But cesium and strontium aren’t the only by-products of nuclear power. There is also the element plutonium. And plutonium, as it happens, is a very useful by-product—capable of powering a nuclear plant just like uranium. In fact, by the time a fuel rod is typically removed from a power plant, most of its uranium is spent, and its power is coming from the plutonium. Yet once the fuel rod is decommissioned, that plutonium is locked away with the cesium and strontium. Simply by extracting the plutonium and putting it back into the reactor, a power plant could generate thousands of megawatt hours of additional power.

This is not just theoretical. In England, for example, virtually every speck of waste that comes out of the nation’s nineteen nuclear reactors is harvested for plutonium and then put back to use. The process has been so effective that the British actually began accepting waste from other countries, like Japan and Switzerland, extracting their plutonium and then shipping it back to be burned. In Japan, the results have been so lucrative that the government is opening its own reprocessing facility. Likewise, in Germany, where nuclear power is being phased out by the government, those plants still in service not only can reprocess their waste; until very recently, they were required to. And last year, the French government renewed its commitment to reprocessing as well.

If the U.S. nuclear industry were simply to join these countries, the benefits would be instantaneous. First, since every gram of “waste” currently waiting for Yucca Mountain contains at least some plutonium, the decision to harvest that waste could instantly eliminate the 132-million-pound backlog waiting for Yucca to open. With the stroke of a pen, virtually every gram of the nation’s spent fuel could be moved from the “waste” column into the “fuel” column. Perhaps even more important, the energy recovered from that “waste” would provide a windfall of power: hundreds of millions of megawatt hours, or enough to meet the entire country’s electrical needs for years.

Why is the United States determined not to reprocess fuel, choosing instead to put thousands of pounds of plutonium into a scrap heap? The answer, like so many things in the nuclear debate, has more to do with fear than reason. The decision to ban reprocessing, it turns out, emerged in the 1970s from a concern that plutonium, which can also be used to make a nuclear bomb, might be difficult to keep away from terrorists. What seems not to have registered is the reality that the U.S. government already sits on thousands of pounds of military plutonium, which it has guarded without incident at least since the Nagasaki bombing of August 1945. In fact, the government currently sits on so much plutonium that it is preparing to reduce its stockpile by…giving it to nuclear-power plants for fuel. The fact that those plants already own enough plutonium to fire their plants for years, and that using their own plutonium would save them millions of dollars in waste-storage costs, providing untold wattage of electrical power even while eliminating the backlog for Yucca Mountain, gets lost in the shuffle because, well, plutonium is scary and we shouldn’t keep it around. Even though we already do.

This is not to say that reprocessing would render Yucca Mountain completely obsolete. Even after the waste is reprocessed and the plutonium removed, all the other by-products like cesium and strontium must still be stored for the long term. Countries that reprocess fuel have not yet found an ideal way to do this, encountering many of the same storage problems the U.S. faces at Yucca Mountain. But at least in those countries the “waste” really is waste, whereas the American “waste” is wasteful in itself—and raises a special conundrum all its own: By the year 2012, the volume of “waste” in line for Yucca Mountain will exceed the available space inside Yucca Mountain. As Michael Voegele told me when I visited, “Yucca Mountain will be full the day she opens. Every space is already accounted for.”


As the nuclear puzzle continues to challenge us, many of these policies are on track to collide, creating second- and third-generation problems. Without new nuclear plants, for example, the American power supply will not simply remain as it is; as time passes and nuclear plants grow older, we will have to choose between extending licenses to those plants, far beyond their intended life expectancy, or else closing them and increasing our dependence on fossil fuels. Similarly, without a long-term repository for our waste, we must prepare for the reality that our temporary facilities, scattered among those aging power plants, are growing older, too—increasing the likelihood of a waste-related leak or spill, and then, even greater public fear of the plants. Finally, without the option to recycle our waste, we must continue to harvest new uranium from the ground, but the volume of uranium deposits in the United States is dwindling, and a series of ill-considered policies over the past twenty years have all but wiped out the American uranium miner. In mines from the Colorado Plateau to the Wyoming plains, I repeatedly found miners struggling with the government’s decision in the early 1990s to buy Soviet uranium and liquidate its own stockpile, which flooded the market (already struggling after TMI and a post–Cold War ebb in weapons spending) and drove nearly every American uranium company out of business. Today most of the world’s biggest known uranium reserves are in Canada, Australia, and…Kazakhstan. As one American uranium miner told me when I visited his floundering operation on the Gulf Coast, “It would be a shame to wean ourselves from foreign oil, only to become dependent on foreign uranium.”

At the same time, the stalemate at Yucca Mountain is quickly compounding: Over the past decade, a host of nuclear companies have lost faith in the project and have begun filing lawsuits against the federal government, asking to be compensated for the cost of storing their own waste. Some of those companies have already received millions of dollars in settlement fees, and over the next decade, other suits may result in hundreds of millions of dollars more—creating a somewhat bizarre contradiction for taxpayers, who are on the one hand paying millions of dollars to build Yucca Mountain and, at the same time, paying millions of dollars not to.

And yet, like so much of the nuclear-policy debate, none of this reflects on the potential of nuclear technology itself.

It reflects on us.

When the United States is determined not to recycle fuel, we pay for that policy every day in rising utility bills, yet most of us neither know nor care about the massive untapped energy source hidden in our pile of “waste.”

When Harry Reid insists on radiation limits at Yucca Mountain that make the repository impossible to open, most of us nod in silent agreement, yet the risk becomes immeasurably higher for more than half of all Americans, who live within fifty miles of a plant, their water supply flowing past toxic waste every day.

When a meltdown like Three Mile Island scares us blind, creating an apocalyptic mythology about what happened there, we pay for our superstitions with sulfur fumes, global warming, and acid rain—our suicide pact with coal.

And when we fail to consider each of these issues with reason instead of fear, when we fail to make the tough comparison between nuclear power, with its potential for disaster, and coal plants, with their guarantee of it, this isn’t a reflection that we have no choices but that we refuse to make them.

It may be, more than anything else, an example of democracy working and failing at the same time.

wil s. hylton is a gq correspondent.

Wednesday, February 27, 2008

Google's Rapid Growth in Datacenters and Green IT

by Triniman

Google remains one of the most interesting and, not surprisingly, most secretive technology companies in the world. Their rapid expansion, together with their goal to maintain as small a carbon footprint as possible (“Green IT,” as Gartner calls it), is as fascinating as it is challenging.

Back in March 2001, Google had about 8,000 servers at four server farms. In January 2005, Google told the CBS television show 60 Minutes that it had around 100,000 servers. Estimates for 2006 indicate they had 450,000 servers located in 25 global datacenters, and last year it was estimated that they had around 500,000 servers spread over 60 datacenters. Five new U.S. datacenters are under construction, and with around 8,000 servers each, Google could have around 550,000 servers in 2008. The company also buys 10,000 servers a month; when one fails, instead of repairing it, Google discards it and plugs in another.
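The estimate for 2008 is simple enough to sketch; the figures below are the article’s own rough numbers:

```python
# Rough sketch of the server-count estimate quoted above.
servers_2007 = 500_000          # estimated servers as of last year
new_datacenters = 5             # new U.S. datacenters under construction
servers_per_datacenter = 8_000  # approximate servers in each

servers_2008 = servers_2007 + new_datacenters * servers_per_datacenter
print(servers_2008)  # 540000 -- i.e. "around 550,000 servers in 2008"
```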

According to Wikipedia, “The size of Google's search system is presently unknown; the best estimates place the total number of the company's servers at 450,000, spread over twenty five locations throughout the world, including major operations centers in Dublin (European Operations Headquarters) and Atlanta, Georgia. Google is also in the process of constructing a major operations center in The Dalles, Oregon, on the banks of the Columbia River.”

In June 2006, the New York Times reported that the Oregon site is known locally as Project 02 and was selected for the reliable availability of electricity from the nearby Dalles Dam and the Celilo Converter Station, as well as the abundant fiber-optic cable left over from the overbuilding of the dot-com boom. The building is expected to be the size of two football fields and has not only created hundreds of construction jobs but also lifted surrounding real estate values by 40%. The cooling towers will be four stories high, and the server farm will create between 60 and 200 jobs in the town of 12,000 people.

Google’s datacenter in The Dalles, Oregon, is about a mile from the Dalles Dam. In the old days, as someone recently pointed out, aluminum-smelting companies tried to locate their operations near hydroelectric dams; now the same thing is happening with Google’s datacenters. Google’s corporate philosophy also includes a strong environmental bent, and its charitable arm supports such initiatives. Not surprisingly, the company has constructed a datacenter powered by a wind farm in the Netherlands, with another to be built in Council Bluffs, Iowa. The $600 million Iowa project will eventually employ 200 people and should be fully operational by 2009. Google won’t say much about any of this, lest it give away the actual size of its operations, but it has a goal of being carbon neutral in 2008.

According to the July 2007 article “Some of Google's Secret to Success,” Google started with racks of motherboards, mounted four to a shelf on corkboard, with cheap generic hard drives densely packed together like “blade servers before there were blade servers,” according to one Google executive. Today, Google can send prefabricated datacenters anywhere in the world by packing them into standard 20- or 40-foot shipping containers. In fact, last year the company received a patent for the concept, called “Modular Data Centers,” which it had applied for back in 2003 despite possible prior art.

At a January 2008 dinner meeting of the Winnipeg section of the Canadian Information Processing Society, speaker Stephen Kleynhans, Vice President of Research at Gartner Inc., the leading IT research and advisory firm in the U.S., said that Google needs access to power and cooling, and that there is unsubstantiated talk that the company is considering locating server farms up north to take advantage of the cooler temperatures. Data transmitted over power lines, he noted, does not diminish over long distances, but power itself does. The power bill will be lower if you build server farms up north, where it’s naturally cold, rather than clustering them around power sources like hydro dams or wind farms; the existing power lines can then be used to move the data to major communication hubs.

A few years ago, the idea of using power lines to bring broadband Internet to rural populations in the U.S. was floated, though it had the downside of interfering with amateur radio transmissions. There’s little information about this concept that can be Googled, but one has to wonder if the Internet giant will open datacenters in Manitoba’s north someday. The Canadian province has the hydro lines, northern populations to serve as potential employees, and consistently colder-than-average ambient temperatures.

Tuesday, February 26, 2008

Sun Leaks 6-core Intel Xeon, Nehalem Details

Kristopher Kubicki (Blog) - February 25, 2008 3:11 AM

Sun confirms what Intel has been dying to tell us, at least off the record

Late last month in Austria, Intel presented Sun with roadmaps discussing details of its upcoming server platforms, including the fairly secret Xeon Dunnington and Nehalem architectures. Unfortunately for some, this presentation ended up on Sun's public web server over the weekend.

Dunnington, Intel's 45nm six-core Xeon processor from the Penryn family, will succeed the Xeon Tigerton processor. Whereas Tigerton is essentially two 65nm Core 2 Duo processors fused onto one package, Dunnington will be Intel's first Core 2-based processor with three dual-core banks.

Dunnington includes 16MB of L3 cache shared by all six cores. Each pair of cores can also access 3MB of local L2 cache. The end result is a design very similar to the AMD Barcelona quad-core processor; however, each Barcelona core contains its own 512KB of L2 cache, whereas Dunnington cores share L2 cache in pairs.
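As a quick sketch using only the figures quoted above, the cache totals of the two designs work out like this:

```python
# Cache arithmetic for the two designs, from the figures quoted above.
dunnington_cores = 6
dunnington_l2_per_pair_mb = 3   # each pair of cores shares 3MB of L2
dunnington_l3_shared_mb = 16    # one L3 pool shared by all six cores

barcelona_cores = 4
barcelona_l2_per_core_kb = 512  # private L2 per Barcelona core

dunnington_l2_total_mb = (dunnington_cores // 2) * dunnington_l2_per_pair_mb
barcelona_l2_total_kb = barcelona_cores * barcelona_l2_per_core_kb

print(dunnington_l2_total_mb)  # 9 MB of L2 across three dual-core banks
print(barcelona_l2_total_kb)   # 2048 KB, i.e. 2 MB of private L2 in total
```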

To sweeten the deal, all Dunnington processors will be pin-compatible with Intel Tigerton processors and will work with the existing Clarksboro chipset. Intel's slide claims this processor will launch in the second half of 2008 -- a timeline consistent with previous roadmaps from the company.

The leaked slide deck also includes more information about Intel's Penryn successor, codenamed Nehalem. Nehalem is everything Penryn is -- 45nm, SSE4, quad-core -- and then some. For starters, Intel will abandon the front-side bus model in favor of QuickPath Interconnect, a serial bus similar to HyperTransport.

Perhaps the most ambitious aspect of Nehalem? For the first time in 18 years, Intel will pair its processor cores with on-die memory controllers. AMD made the switch to on-die memory controllers in 2003, and for the next three years its processors were almost unmatched by Intel's offerings. The on-die memory controller can't come a moment too soon: Intel will also roll out tri-channel DDR3 with Nehalem, and all that extra bandwidth can only be put to use if there are no bottlenecks.

As noted by ZDNet blogger George Ou, the slides contain some rudimentary benchmarks for Nehalem and other publicly available processors. From this slide deck, Ou estimates Nehalem's SPEC*fp_rate_base2006 at 163 and its SPEC*int_rate_base2006 at 176. By contrast, Intel's fastest Harpertown, the Xeon X5482, pulls a measly 80 and 122 on the same fp and int tests.

The Nehalem estimates more than double the floating-point performance of Intel's current Penryn-family processors. Ou adds, "We’ll most likely know by the end of this year what the actual scores are, but I doubt they will be more than 5% to 10% off from these estimated projections."
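Working the ratios directly from the leaked numbers:

```python
# Speedup implied by the leaked estimates: Nehalem vs. the Harpertown X5482.
nehalem_fp, nehalem_int = 163, 176       # estimated SPEC*_rate_base2006 scores
harpertown_fp, harpertown_int = 80, 122  # published Xeon X5482 scores

print(round(nehalem_fp / harpertown_fp, 2))    # more than double on floating point
print(round(nehalem_int / harpertown_int, 2))  # roughly 44% faster on integer
```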

It's important to note that these estimates are not actual benchmarks. Intel's document states, "Projections based on *SPECcpu2006 using dual socket Intel Xeon 5160 Processor performance as the baseline." As discussed on DailyTech before, simulated benchmarks offer little substance compared with the real deal.

As of February 2008, the company plans to launch Nehalem in Q4 2008.

Sun has since removed the slide deck from its website.

A document consistent with Intel roadmaps details the company's upcoming 6-core processor

The same slide deck details side-by-side projections of Nehalem with other AMD and Intel processors

George Ou estimates the floating point and integer performance by extrapolating data points on the slide (Source: ZDNet)

New supercomputer is a rack of PlayStations

Louisa Hearn
February 26, 2008

When the PlayStation3 was released in November 2006, Gaurav Khanna's wife braved long queues so he could be one of the first people in the US to get his hands on the gaming console.

But the astrophysicist was not itching to burn some rubber in Gran Turismo or shoot hoops in NBA 07. Instead he wanted to build his own supercomputer.

Mr Khanna now owns 16 PS3s, which spend their days simulating the activities of very large black holes in the universe for the physics department at the University of Massachusetts.

Hooked together in a single cluster, the PS3 consoles provide his department with the same amount of computing power as a 400-node supercomputer.

"The challenge these days with supercomputing facilities is that there is a lot of demand for them. So even if I submitted a job that would be expected to take about an hour, it could actually take two days to get started because the queues are so long.

"The PS3 cluster is all mine and was very low cost to set up, which makes it really attractive," he says.

What makes the gaming console vastly superior to high-end computers for complex research algorithms, Mr Khanna says, is the Cell chip built by IBM to facilitate high-end gaming functions on the latest generation of consoles.

In addition the PS3 was built with an open hardware architecture, which can run the Linux operating system.

Based on Einstein's theory of relativity, Mr Khanna's research on black holes is purely theoretical. In order to run his simulation data on the console he has to reprogram it so the algorithms will work on the new architecture.

"Linux can turn any system into a general purpose computer but for it to do work for me I have to run my own code on it for astrophysics applications. The hard part of the job was to make sure my own calculations could run fast on the platform, which meant I had to optimise the written code so it could utilise the new features of the system.

"I am not a Linux person - I am a Mac person - but I was able to follow instructions online," he says.

His next challenge will be to turn his data into graphical simulations using the high end graphics engine included in the PS3.

"We haven't done that yet but it would be very neat to actually see the simulation while it is going on," he says.

Although Mr Khanna was one of the first scientists to optimise the PS3 for his own research work, Tod Martinez, Professor of Chemistry at the University of Illinois, has been tinkering with games consoles ever since his son's original PlayStation malfunctioned.

He says that while researching whether or not to buy a new PS2 for his son he also began to explore the possibilities of using gaming consoles for scientific research.

"The two main things we do are rotations and translations of objects. We also need to get a lot of pixels onto the screen, which means we need big channels to move lots of data. It was pretty clear that modern games consoles mapped really well onto theoretical chemical calculations," he says.

Once the PS2 was released he bought one unit for his son and a few units for himself and made some rudimentary attempts to program it. "But back then the architecture was proprietary and trying to convince the machine to run non-Sony programmed games was difficult."

That situation improved rapidly when Sony released a DVD that would allow users to run Linux on the console. "Only 1000 DVDs were released but they really opened up the architecture considerably," he says.

Since that time Mr Martinez has refined his computing resources considerably and he now runs a cluster of eight cut-down consoles from IBM based on the same Cell chip technology used in the PS3.

He has also expanded his use of gaming technology into graphics cards with the help of a new programming framework developed by graphics hardware specialist Nvidia.

Mr Martinez's field of research is examining how molecules behave when you shine light on them - which has wide ramifications for the fields of agriculture, solar energy and the study of human vision.

"We have done tests of algorithms and Nvidia cards are four to five times faster than the Cell chip, which is 20 times faster than an ordinary high end computer," he says.
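Compounding those two quoted speedups gives a rough sense of the overall gap (a back-of-the-envelope sketch, not a benchmark):

```python
# Compounding the speedups quoted above.
cell_vs_pc = 20          # Cell chip: ~20x an ordinary high-end computer
nvidia_vs_cell = (4, 5)  # Nvidia cards: 4-5x the Cell chip

low, high = (factor * cell_vs_pc for factor in nvidia_vs_cell)
print(low, high)  # roughly 80-100x an ordinary high-end computer
```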

Because both technologies can be classified as "stream processors" they are highly suited to moving massive volumes of data - unlike the general-purpose processors in ordinary computers.

"Some people think it's just about having a faster computer. They don't realise how big a change it is to do computing at your desk after accessing a computer in a room somewhere where you have to wait around for results.

"Of course it does cost less, but what needs to be recognised is that it also changes the way people think about problems when they are given a hundred times more computer power."

Using the example of a black box, Mr Martinez explains that instead of asking basic questions about how it works, you can just start tinkering around with it.

"So rather than taking the thing apart you just start moving all the knobs about to see what happens when you change something - just as you might in real life."

Dell Drop Kicks Proprietary Parts

Posted 02/26/08 at 01:25:52PM | by Gordon Mah Ung

Dell, long dinged for using proprietary hardware in its gaming PCs, has seen the light. The company says annoying traits such as proprietary motherboards and power supplies are now a thing of the past.

The first XPS to shed the proprietary hardware will be the new budget XPS 630 gaming machine. Based on the nForce 650i chipset, it will let you swap the board and PSU out for any other ATX-spec hardware, Dell claims.

The change is long overdue. In the past, Dell has used designs that looked ATX-like but actually were not. If you had tried to swap the power supply in your Pentium III Dimension XPS B 733R years ago, you would have been greeted by a charred motherboard: the company wired its PSUs differently from the industry standard but did not key the connectors differently. For years, PC Power and Cooling has made a small side business of selling Dell upgrade PSUs. More recently, the company has been called out over BTX support and even for using a proprietary power plug in its more recent XPS gaming rigs.

Why the use of proprietary designs? Cynical observers have said the company was just trying to lock customers into buying parts exclusively from Dell. The company has long defended the practice by saying that the variations from spec were because its engineers found the specs to be lacking. But the heat from critics, and machines such as Hewlett-Packard's Blackbird 002 going all ATX, apparently has forced Dell to see the light. Company officials said the mantra in Dell engineering is now that varying from the specs must be avoided at all costs.

The change from proprietary parts won't be the only new trick for the XPS 630, though. Dell has taken a page from Hewlett-Packard and claims the XPS 630 will support either Nvidia's dual-card SLI or AMD's dual-card CrossFire. How can Dell do this? Maximum PC spoke to AMD graphics officials, who said the capability is being offered only to select OEMs who take responsibility for making sure the drivers fully work with the BIOSes on the motherboards.

So why not just release such a driver to the public to let any nForce-user run CrossFire? The company said it is worried that a certain company could affect the performance of its cards when CrossFire is run on an SLI board so public support just isn’t going to happen. In the case of Dell and HP, AMD feels both have enough influence to keep performance problems from cropping up on their systems. For now, the CrossFire support is only through drivers obtained directly from Dell.
Dell says the XPS 630 will also be the first tier-one OEM system to support Nvidia's Enthusiast System Architecture. In the XPS 630's case, ESA will let the user control the lights in the system; ESA control of the PSU or other components will not be supported initially.

The $1,300 version of the XPS 630 will ship with a 2.4GHz Core 2 Quad Q6600, GeForce 8800GT and a 750 watt power supply. Dell said the BIOS on the XPS 630 will support overclocking and is upgradeable to both dual and quad-core Intel Penryn CPUs.

Sunday, February 24, 2008

SXSW 2008 - Free download of 764 MP3s in a torrent

For the past four years, the annual South By Southwest collection of festivals in Austin, Texas, has offered free downloads of MP3s from the hundreds of bands who play its showcase events in the clubs. Now in its 22nd year, the festival has decided not to offer the convenient torrent that provided all the songs in one download.

Fortunately, Greg Hewgill, one of the early seeders of the annual torrent, has taken it upon himself to create his own unofficial torrent file containing MP3s from 764 bands who are scheduled to appear this year, March 12 - 16, 2008.

You can still download individual MP3s from each of the 764 bands, from the official SXSW 2008 site. Or, for those of you who have missed the previous torrents, you can download them all from this page on Greg Hewgill's blog.

The Music and Media Conference (March 12 - 16) includes a music festival, conference and trade show. The conference keynote speaker this year is Lou Reed. Previous keynote speakers have included Pete Townshend, Neil Young, Robert Plant, Daniel Lanois, Johnny Cash, Steve Earle, Ray Davies and Robbie Robertson. You can view the list of the over 1700 bands who will be performing on the SXSW website, which includes links for downloading MP3s.

The Interactive Festival (March 7 - 11) is all about cool, emerging technologies and includes the 11th annual SXSW Web Awards.

The Film Festival (March 7 - 15) features around 250 feature and short films. It also includes four days of conferences with panelists, dubbed the "Four-Day Film School."

According to the SXSW 2007 demographics site, SXSW 2007 hosted more than 123,000 total attendees:

  • more than 24,000 conference registrants
  • approximately 12,000 music festival wristband holders
  • more than 35,000 single admissions to the film and music events
  • more than 15,000 Flatstock poster show attendees (free-to-the-public, held during the SXSW Music Conference)
  • 32,000 Town Lake Stage concert attendees (SXSW's largest free-to-the-public stage)
  • more than 5,000 ScreenBurn gaming arcade attendees (free-to-the-public, held during SXSW Interactive)

Saturday, February 23, 2008

Gartner's 10 predictions for IT industry for 2008 and beyond

Gartner's 10 predictions for IT industry for 2008 and beyond


January 31, 2008

Leading IT industry analyst firm Gartner Inc has highlighted 10 key predictions of events and developments that will affect IT and business in 2008 and beyond.

The predictions highlight areas where executives and IT professionals need to take action in 2008. The full impact of these trends may not appear this year, but executives need to act now so that they can exploit the trends for their competitive advantage.

"Selected from across our research areas as the most compelling and critical predictions, the trends and topics they address this year indicate a strong focus on individuals, the environment, and alternative ways of buying and selling IT services and technologies," said Daryl Plummer, managing vice president and Gartner Fellow, in a media release on Thursday.

The predictions that Gartner has made for the year 2008 focus on general technology areas rather than on specific industries or roles. These include:

1. By 2011, Apple will double its American and western European market share in computers: Apple's gains in computer market share reflect as much on the failures of the rest of the industry as on Apple's success. Apple is challenging its competitors with software integration that provides ease of use and flexibility; continuous and more frequent innovation in hardware and software; and an ecosystem that focuses on interoperability across multiple devices (such as iPod and iMac cross-selling).

2. By 2012, 50 per cent of traveling workers will leave their notebooks at home in favour of other devices: Even though notebooks continue to shrink in size and weight, traveling workers lament the weight and inconvenience of carrying them on their trips. Vendors are developing solutions to address these concerns: new classes of Internet-centric pocketable devices at the sub-$400 level; and server and Web-based applications that can be accessed from anywhere. There is also a new class of applications: "portable personality" software that encapsulates a user's preferred work environment, enabling the user to recreate that environment across multiple locations or systems.

3. By 2012, 80 per cent of all commercial software will include elements of open-source technology: Many open-source technologies are mature, stable and well supported. They provide significant opportunities for vendors and users to lower their total cost of ownership and increase returns on investment. Ignoring this will put companies at a serious competitive disadvantage.

Embedded open source strategies will become the minimal level of investment that most large software vendors will find necessary to maintain competitive advantages during the next five years.

4. By 2012, at least one-third of business application software spending will be via service subscription instead of product license: With software as a service (SaaS), the user organization pays for software services in proportion to use. This is fundamentally different from the fixed-price perpetual license of the traditional on-premises technology. Endorsed and promoted by all leading business applications vendors (Oracle, SAP, Microsoft) and many Web technology leaders (Google, Amazon), the SaaS model of deployment and distribution of software services will enjoy steady growth in mainstream use during the next five years.

5. By 2011, early technology adopters will forgo capital expenditures and instead purchase 40 per cent of their IT infrastructure as a service: Increased high-speed bandwidth makes it practical to locate infrastructure at other sites and still receive the same response times. Enterprises believe that, as service-oriented architecture (SOA) becomes common, 'cloud computing' will take off, untying applications from specific infrastructure.

This trend to accepting commodity infrastructure could end the traditional 'lock-in' with a single supplier and lower the costs of switching suppliers. It means that IT buyers should strengthen their purchasing and sourcing departments to evaluate offerings. They will have to develop and use new criteria for evaluation and selection and phase out traditional criteria.

6. By 2009, more than one-third of IT organisations will have one or more environmental criteria in their top six buying criteria for IT-related goods: Initially, the motivation will come from the wish to contain costs. Enterprise data centres are struggling to keep pace with the increasing power requirements of their infrastructures. And there is substantial potential to improve the environmental footprint, throughout the life cycle, of all IT products and services without any significant trade-offs in price or performance.

In the future, IT organisations will shift their focus from the power efficiency of products to asking service providers about their measures to improve energy efficiency.

7. By 2010, 75 per cent of organisations will use full life cycle energy and CO2 footprint as mandatory PC hardware buying criteria: Most technology providers have little or no knowledge of the full life cycle energy and CO2 footprint of their products. Some technology providers have started the process of life cycle assessments, or at least were asking key suppliers about carbon and energy use in 2007 and will continue in 2008.

Most others will start some level of more detailed life cycle assessment in 2008 and begin using such information to differentiate their products in 2009, so that by 2010 enterprises will be able to start using the information as a basis for purchasing decisions.

8. By 2011, suppliers to large global enterprises will need to prove their green credentials via an audited process to retain preferred supplier status: Those organisations with strong brands are helping to forge the first wave of green sourcing policies and initiatives. These policies go well beyond minimizing direct carbon emissions or requiring suppliers to comply with local environmental regulations.

For example, Timberland has launched a 'Green Index' environmental rating for its shoes and boots. Home Depot is working on evaluation and audit criteria for assessing supplier submissions for its new EcoOptions product line.

9. By 2010, end-user preferences will decide as much as half of all software, hardware and services acquisitions made by IT: The rise of the Internet and the ubiquity of the browser interface have made computing approachable and individuals are now making decisions about technology for personal and business use.

Because of this, IT organisations are addressing user concerns through planning for a global class of computing that incorporates user decisions in risk analysis and innovation of business strategy.

10. Through 2011, the number of 3-D printers in homes and businesses will grow 100-fold over 2006 levels: The technology lets users send a file of a 3-D design to a printer-like device that will carve the design out of a block of resin. A manufacturer can make scale models of new product designs without the expense of model makers. Or consumers can have models of the avatars they use online.

Ultimately, manufacturers can consider making some components on demand without having an inventory of replacement parts. Printers priced less than $10,000 have been announced for 2008, opening up the personal and hobbyist markets.

Friday, February 22, 2008

How to dual boot Vista with Ubuntu

By Jonathan Schlaffer

There are two ways to run several operating systems on your computer: you can run them inside one another with virtualization, or you can install them alongside each other and boot each individually. The latter is the easiest method for most users.

You don’t even have to create a dual-boot system; Linux (in this case, Ubuntu) can be installed by itself without Vista or any other operating system installed first. But most users purchasing a new computer will find Vista preinstalled, so that is the route I will take for this tutorial.

Let’s take a look at installing Ubuntu alongside Vista on a computer, assuming Vista is installed first as this will cover most instances.

1. You need to boot into Vista and set some hard drive space aside for your Ubuntu installation. Right click “Computer” and go to “Manage.”


From there go to “Disk Management.” Right click the drive you want to partition for dual booting and select “Shrink Volume.” In most cases you will be working with the C: drive. How much space you set aside will depend on the size of your drive but 20GB should be sufficient. Choose more space if you have it.


An optional step is to create another partition, in the same manner as above, for shared storage if you want to share files between Vista and Ubuntu. Again, choose as much space as you think you’ll need or as much as you can spare. You may have to use a third-party utility to format this optional partition as FAT32, since Ubuntu has limited support for NTFS volumes, the default for Windows 2000, 2003 and Vista.


Once the settings are to your liking, click “Shrink.” Note: do not format the newly created partition; leave it as is. Ubuntu will later format and size it as necessary.

2. Obtain Ubuntu by downloading the ISO and burning it to a CD. You will need an image-burning program to do so; Nero, DeepBurner and CDBurnerXP Pro all have this capability. Once you have downloaded and burned the Ubuntu image, you are ready to start the installation process.

3. Insert the CD you just burned into your optical drive and wait for it to boot. Note: if it does not boot, you will need to change the boot drive settings in your BIOS. These vary between manufacturers but are broadly similar; consult your motherboard or PC manufacturer documentation on how to change the boot drive order if you are not sure.

4. Once the CD boots, select the “Start or Install” option. Ubuntu will boot to a live version of the operating system that runs off the CD. Once it has loaded, you will be able to check out the features of the operating system and compatibility with your hardware. Double click the “Install” icon on the desktop to begin the installation process.


5. The first steps of the installation process are to select your language and location, which sets the appropriate time zone. After doing so, click “Forward.”


6. Set up a partition for the installation of Ubuntu. Remember that hard drive space you set aside in the first step? Ubuntu is now going to make use of it. The Partition Manager will start; its default action is to use the largest unpartitioned space present on the hard drive. Accept that default. Advanced users are welcome to configure partitions manually, but I cannot recommend doing so if all you want is to get Ubuntu installed. Click “Forward.”


7. (Mostly optional.) Migrate settings. If you have settings you want to import into Ubuntu, do so here; I skipped this step.


8. Enter your account information. Here you will need to select a username and password to log in to your Ubuntu installation. Click “Forward” and you will be ready to install.


9. Install. This is the last screen before the installation begins; here you can review and change settings if you are not happy with them. Assuming everything is to your liking, click the “Install” button.


10. Grab a soda, make a snack and wait for the installer to finish. This shouldn’t take more than 40 minutes to an hour at the outside; on faster systems, it may finish in as little as 20 minutes. The last phase of the installation process is to install and configure GRUB (the bootloader). Once that is complete, you will be able to select which operating system you wish to boot; Ubuntu will be selected by default.

These settings can be changed within Ubuntu itself should you want Vista to boot first instead. If you’ve made it this far, Ubuntu is installed and ready to use, for the most part.
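As a sketch of what changing the default boot entry involves: Ubuntu releases of this era used GRUB Legacy, whose menu lives in /boot/grub/menu.lst. The `default` line picks which entry boots if you don’t intervene. Treat the following as an illustration, not your file’s exact contents; entry numbers and titles will differ on your system:

```
# /boot/grub/menu.lst (GRUB Legacy) -- illustrative sketch only.
# Boot entries are numbered from 0 in the order they appear in this file.
default  0     # index of the entry to boot by default; point it at the
               # Windows Vista entry (often the last one) to boot Vista first
timeout  10    # seconds to wait at the menu before booting the default
```

Count the `title` lines in your own menu.lst to find the index of the Vista entry before editing `default`.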

There may be some lingering hardware compatibility issues, most likely with your graphics card, wireless card or some other nonsense like that. If you need help, any competent search engine should be able to point you towards a tutorial to fix the particular problem. But it’s almost guaranteed to be a graphics card or wireless hardware issue.

Some users have also been unable to set the correct output resolution for their monitor. Solutions for that vary; you’ll have to search online and may have to try several fixes before it’s corrected.

After all that, it’s time to enjoy Ubuntu.


Thursday, February 21, 2008

PC - Drive-Imaging: Beyond Backup

You may never have had a hard drive fail or become so corrupted that your data was beyond retrieval, but that doesn't mean you never will. And think of this: If your system ever fails in a hardware or software disaster, will you really have time to reinstall Windows and all your applications and tweak all your OS and application settings? I'm guessing the answer is no. I certainly don't. This is why you need a drive-imaging program that backs up your complete system—including all your data and applications—and can restore it all in minutes.

Traditional backup programs (we'll be looking at those in the near future) back up your documents, photos, music, and spreadsheets. Drive-imaging programs do much more. They do back up your data, of course, but they also back up your applications, your whole Windows system, and all the low-level drivers and software that you normally never notice but without which Windows can't manage. An ordinary backup program copies your files. A drive-imaging program makes a byte-by-byte duplicate of your full hard drive (or of one or more partitions if you've divided your physical drive into multiple logical drives), maintaining the identical data structure.
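The byte-by-byte idea can be sketched in a few lines. This is a toy illustration only (ordinary files stand in for the raw disk device, and the function names are made up); real imaging products add compression, verification and snapshotting of in-use files on top of the basic copy:

```python
# Toy sketch of a byte-for-byte drive image (illustrative only; the
# function names are invented). Ordinary files stand in for the raw
# disk device; real imaging tools read the device itself and add
# compression, verification, and open-file snapshotting.

CHUNK = 64 * 1024  # copy in 64 KB chunks to bound memory use

def make_image(source_path, image_path):
    """Copy every byte of the source, unchanged, into the image file."""
    with open(source_path, "rb") as src, open(image_path, "wb") as img:
        while True:
            chunk = src.read(CHUNK)
            if not chunk:        # end of source reached
                break
            img.write(chunk)

def restore_image(image_path, target_path):
    """Restoring is the identical copy in the opposite direction."""
    make_image(image_path, target_path)
```

Because nothing is interpreted along the way, whatever data structure the source held, including the filesystem metadata a file-level backup would skip, survives the round trip intact.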

Hence, if your drive fails, you can pop in a new drive and restore your system to exactly the state it was in when you made the drive image. For that matter, if your system becomes unstable because you installed software that won't uninstall cleanly, you simply write over the drive (or partition) with the stable system you were using a few days ago. You've probably noticed that Windows' System Restore often doesn't restore your system because it doesn't image your drive. Instead, it performs fancy tricks with the Windows Registry and other Windows files—tricks that work on a clean, simple system but are too complicated to perform reliably on a heavily used real-world system. Drive-imaging software, by contrast, always restores your system.

Of course, you could achieve the same result by reinstalling Windows and all your drivers and applications—if you can find them again. And then you could use a conventional backup program to restore your data. But that process would likely take a day or two, if you're lucky, and your system still won't have all the tweaks and customizations that make it your own.

Drive-imaging software is worth having even if your hard drive never fails. You can use the software to clone a single system to multiple computers. Some products let you make an image of your current Windows system and then transfer the whole system to a new computer that uses different hardware without installing Windows from scratch—something that's normally impossible when you upgrade to a new machine.

Today's imaging software can work in the background even while you're running your system. It can also create incremental backups, which speed the process by storing just the changes made to your drive since the last full backup. If your drive or system fails, you can boot from an emergency CD, then restore your system from the backup image. Most of these products provide a bootable CD that you can use when Windows won't boot from the hard drive, or when you don't want to boot to a corrupted Windows system. If you boot from the emergency CD, you can run the imaging software from a copy on the CD itself—you don't need the copy installed on your hard drive. You can then restore your system from an image backup on an external, network, or DVD drive, or even from an image stored on a data partition of your hard drive.
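One way an incremental backup can work, sketched as toy code (the block size, function names and hashing approach here are illustrative assumptions, not any vendor's actual method): split the drive into fixed-size blocks and store only the blocks that changed since the base image.

```python
# Toy sketch of block-level incremental backup (illustrative only):
# hash fixed-size blocks and keep just the blocks whose hash changed
# since the base image. Shipping products typically track changed
# sectors with a low-level driver instead of re-hashing everything.
import hashlib

BLOCK = 4096  # block size in bytes (an arbitrary choice for the sketch)

def block_hashes(data):
    """Hash each fixed-size block of the data."""
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def incremental(base, current):
    """Return {block_index: block_bytes} for blocks that changed."""
    old, new = block_hashes(base), block_hashes(current)
    return {i: current[i * BLOCK:(i + 1) * BLOCK]
            for i in range(len(new))
            if i >= len(old) or new[i] != old[i]}

def restore(base, delta):
    """Apply an incremental delta on top of the base image."""
    blocks = [base[i:i + BLOCK] for i in range(0, len(base), BLOCK)]
    for i, chunk in delta.items():
        while len(blocks) <= i:   # the image grew since the base
            blocks.append(b"")
        blocks[i] = chunk
    return b"".join(blocks)
```

Restoring last Tuesday's state then means applying the base image plus each incremental delta up to that date, which is why these products can bring back any saved state without storing a full copy of every day's drive.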

Even when your system is running smoothly, these utilities can help you out by letting you dig into a backup image to get back an older version of any file on your system. Imaging software includes a feature that lets you "mount" an image so that it appears as a drive letter in Windows Explorer, then copy files from the mounted image to your hard drive. These products can also copy partitions from one physical hard drive to another, a feature most useful with partitions that contain data only (rather than a full Windows system, since, on a new machine, the OS usually requires an actual installation).

What I Found

I tested five products. Four were products you have to pay for: Acronis True Image 11 Home, Paragon Drive Backup 8.5 Personal Edition, ShadowProtect Desktop 3.1, and Symantec's Norton Ghost 12.0. The fifth, DriveImage XML, was freeware (but not open source). The commercial products all provide an emergency boot CD for restoring a Windows system partition. The freeware product requires you to put on your propeller beanie and build a bootable BartPE CD, a procedure not recommended for beginners.

All the utilities here can save to the same physical hard drive that you're backing up, and can even save a copy of your Windows system partition to that very same partition—a feat that sounds impossible but isn't, although I wouldn't recommend doing so. Or rather, it's fine to do so, but not exclusively. Hard drives aren't immortal; eventually yours will fail, so you should store your backups to a hard drive that plugs into a USB, FireWire, or eSATA port, to another computer on your network, or to a Network Attached Storage (NAS) device. You can also use removable media, like Iomega REV disks or writable DVDs, though you'll be stuck feeding disks into the device. In any case, if you're serious about preserving your data, use a storage medium that you can store off-site.

The commercial products let you use Windows Explorer to browse a backed-up image just as you would a real drive on your system: the backed-up image acts as a virtual drive that gets its own drive letter, and you can view the backed-up files by double-clicking on them in Explorer or copy files from the virtual drive to your real one. DriveImage XML instead uses its own built-in file browser to achieve the same results, and it works almost as well. With most of the products, this virtual drive is read-only: You can copy files from it, but you can't write to it or modify existing files. But a unique and valuable feature in ShadowProtect Desktop lets you write back to a virtual drive so that you can, for example, run a virus-removal program on a backed-up image if you later discover that a virus has been lurking on your system and got into your backups.

For safety and convenience, I divide my hard drive into multiple partitions, keeping all my documents in a single dedicated partition that's separate from the one my Windows system is on. I use drive-imaging software to make full backups of both twice a month and incremental backups daily, so that I can easily recover last week's version of any file or restore my Windows partition to the state it was in before I installed the software that wrecked it. The image backup of my document partition also lets me effortlessly transfer all my documents to a new computer.

One important word of warning before you try out these products: Never install more than one on your system at the same time. These utilities use low-level disk-access features built into Windows, and if you have two simultaneously installed—even if only one is running—one or both won't work. If you decide to try out more than one, be sure to completely uninstall the first and restart your system before installing the next.

What about Macs?

Of course, Mac fans are probably shaking their heads in disbelief at what Windows users put up with, in cost and complexity, to get backup imaging. Mac devotees still running OS X 10.4 Tiger must turn to third-party imaging software, just like their Windows-using colleagues, but they will find it a lot simpler. The two programs I use are the freeware Carbon Copy Cloner and SuperDuper! ($27.95, direct). Both support automatic scheduled backups, are blissfully worry-free, and have restored my Mac to perfect condition.

If you use OS X 10.5 (Leopard), you've already got drive-backup software. Just plug in a USB or FireWire external drive and the OS will offer to start backing up all your files automatically with the built-in Time Machine, a feature that keeps hourly backups of your work on the current day and retains a single daily backup for each previous day. Time Machine lets you grab any file or version of a file from the past and restore it. If your hard drive fails, you can simply boot from the OS X DVD that came with your system, fire up Time Machine, and restore your whole system. Maybe someday Windows will catch up with this kind of convenience, but don't bet on it.

Microsoft includes an image backup feature in Vista, but not one that you'll want to use in a real-world system. In Vista Business, Ultimate, or Enterprise (but not in other versions), use the Start Menu, go to All Programs, then System and Maintenance, then the Backup and Restore Center, where you'll find an option to back up your entire computer. This option creates an image backup either on a hard disk or DVDs. You can restore the image by booting from the Vista installation DVD to the Vista Recovery Environment. Unfortunately, unlike Time Machine or commercial imaging products, Vista only makes complete image backups, not automatically-scheduled incremental backups, and can only restore a system from an image stored on a local hard disk or DVDs, not from your network. Also, you can't browse through a backed-up image to extract individual files. This is where Vista gives you a few crumbs, while OS X gives you the whole cake.

I've used many of these programs on my home-office system, and they've rescued it when badly written software slowed or damaged it. Rebooting to an emergency CD, navigating to a backed-up image, and restoring my system to exactly the same condition it was in when I made my backup took just minutes. You may not go to the lengths I do to back up documents, but think about how you'd feel if you lost your photos, music, documents, and other files. And remember how much work it took to get your system to work the way you wanted. If your backup strategy doesn't use imaging software, don't wait until the sun goes down to start.

Below are links to our reviews of five of the leading drive-imaging products. As always, click the links to read the full reviews.

Acronis True Image 11 Home
This program's ability not just to perform drive imaging but also to back up and restore specific folders and settings makes it the most flexible backup utility I know. But users with complex systems should watch out for potential problems with the emergency restore CD.

DriveImage XML
This application may have limited features, but it's solid drive-imaging software, and, best of all, it's free. You'll need to dust off your propeller beanie to implement it, though.

Norton Ghost 12.0
This is flexible, powerful drive-imaging and file-backup software with an exceptionally clear interface and lots of scheduling options, but a networking problem with its emergency CD keeps it from being an Editors' Choice.

Paragon Drive Backup 8.5 Personal Edition
This is a flexible, advanced drive backup-and-restore utility. The help file can be opaque, however, and the interface could be daunting to casual users.

ShadowProtect Desktop 3.1
This software provides the fastest and smoothest backups and restores of any drive-image utility on the market, and a Vista-based emergency disk guarantees compatibility with the widest range of backup hardware. ShadowProtect Desktop 3.1 is the best such product and worth ten times its price in terms of peace of mind and flexibility.

Acronis True Image 11 Home

This program's ability not just to perform drive imaging but also to back up and restore specific folders and settings makes it the most flexible backup utility I know. But users with complex systems should watch out for potential problems with the emergency restore CD.

Backs up folders, settings, and e-mail as well as creating drive images. Supports experimenting with new software before committing changes to hard drive. Includes disk cleanup and security software.

Linux-based emergency CD gets confused by complex systems. So-called "Secure Zone" isn't secure if the hard drive fails.

Acronis Inc

Price: $49.99 Direct
Type: Personal, Professional
OS Compatibility: Windows Vista, Windows XP
Tech Support: e-mail and public forum

This software performs more backup feats than any other drive-imaging software. As with its peers, it creates images of whole drives, but it can also back up individual folders, e-mails, and program settings, and it lets you try out changes to your system before deciding whether to keep them. It's slower than our Editors' Choice product, ShadowProtect Desktop, but if you need this kind of flexibility—and you don't have a complex, multidrive system—Acronis True Image 11 Home is the program to choose.

Like its commercial challengers, True Image creates full backup images of whole drives or individual partitions. It can also create regular incremental backups automatically so that you can restore your system to the state it was in when you made the last complete drive image or when you made any incremental backup. The utility creates and restores backups more slowly than ShadowProtect Desktop, but at a speed similar to that of Norton Ghost 12.0 and Paragon Drive Backup 8.5 Personal Edition. I was impressed to see that True Image, like Paragon Personal, let me restore even my Windows system drive without requiring an emergency CD: The utility rebooted, restored the system drive from the image, and then rebooted into the restored drive.
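The full-plus-incremental scheme all these utilities share is easy to illustrate. The sketch below is not Acronis code, just a hypothetical file-level illustration of the idea: a full pass copies everything, and each incremental pass copies only what has changed since a given point in time.

```python
import os
import shutil

def backup(src, dest, since=None):
    """Copy files under src into dest, preserving relative paths.
    If `since` is a timestamp, copy only files modified after it
    (an incremental pass); otherwise copy everything (a full pass).
    Returns the list of relative paths that were copied."""
    copied = []
    for root, _dirs, files in os.walk(src):
        for name in files:
            path = os.path.join(root, name)
            if since is None or os.path.getmtime(path) > since:
                rel = os.path.relpath(path, src)
                target = os.path.join(dest, rel)
                os.makedirs(os.path.dirname(target), exist_ok=True)
                shutil.copy2(path, target)  # copy2 preserves timestamps
                copied.append(rel)
    return copied
```

A real imaging product works at the block level rather than the file level and tracks which disk sectors changed, but the restore logic is the same in spirit: apply the full image first, then each incremental in order.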

Snap Restore, a feature unique to True Image, reboots into Windows, instantly restores the files needed to run the system, and then restores the rest of the drive in the background while you continue to work. But the background activity slowed my system so badly that simply waiting for the ordinary restore process to complete and then getting back to work probably would have been faster.

True Image and Norton Ghost also let me back up e-mail from Outlook, Outlook Express, and Windows Mail as well as folders, including My Documents. In addition, both allowed me to narrow backups to specific types of files (Word documents or Excel spreadsheets, for example). Furthermore, each has a trigger feature that launches a backup whenever a user logs on or off the system or when a specified number of megabytes gets added to the hard drive. Ghost's trigger implementation is more flexible. On the other hand, True Image let me choose to back up application settings, such as my Internet Explorer favorites. None of the other products I tested allowed me to do this. These extra features will be enough to make Acronis the first choice for many.

Working with True Image

True Image is divided into four task categories: basic backup and restore operations, a Try&Decide feature that lets you experiment with your system without making permanent changes, disk cleanup and copying utilities, and management tools for setting up shortcuts to backup locations you want to reuse or for keeping track of existing archives. Other than four icons representing these sets of tasks, the opening screen shows only warnings about problems that may have occurred during recent tasks. Annoyingly, unless you use the menu system, you have to go back to the opening screen to switch from one task category to another. You can jump to specific tasks using the menu, but the menu organizes the tasks in categories differently from those of the opening screen—a poor UI choice that's sure to confuse.

Boxed copies of True Image ship with a Linux-based emergency CD, but even if you downloaded the utility you can burn the emergency disc yourself using a wizard in the main program. The emergency CD should work well for most users in most situations. I used it successfully to restore images from my D-Link DNS-323 Network Attached Storage unit back to the computer I had booted from the CD. This is exactly what I would do if my computer's Windows system had become corrupted and I had to restore it, or if that computer's drive had failed and I had to replace it.

I was taken aback, however, when I tried to restore an image I had saved on an external drive attached by a FireWire cable. The emergency CD got confused by the multiple partitions (C: through H:) on the internal hard drive in the machine from which I booted the CD. The CD mistakenly assigned the drive letter G: to the external FireWire drive and failed to list the actual drive G: when enumerating the partitions on my internal hard drive.

At this point, I turned off my machine. I wasn't about to risk my system by restoring an image using software that misunderstood my hard drive's organization. I should emphasize that this problem occurred only with a FireWire external drive, not a USB drive—but it didn't happen at all with ShadowProtect Desktop or Paragon Personal. Acronis doesn't explain this, but it seems fairly clear that the Linux version the company uses for the emergency CD has weak FireWire support; Linux has generally taken longer to bring its FireWire support up to the level of its USB support. This exact problem probably won't affect you, but I'd still hesitate to use any product that displays problems of this kind. Even if the emergency CD worked perfectly with my current hardware, I wouldn't be confident that it would still work if I had to use it on a new computer with the latest hardware. You could ignore this warning only if you were absolutely certain that your current hardware would last forever and that you'd never have to restore a backup image to a machine with a new motherboard or a new disk technology, and that's not something anyone can be certain about.

Try and Decide with Try&Decide

A True Image option will create an area on your hard drive for storing backups, but this Secure Zone facility, as the utility calls it, is misnamed: It's secure only until your hard drive fails, and then it's probably useless. You'd be better off storing backups on one or more external drives. The Secure Zone does earn its keep in one way, though: the terrific Try&Decide feature lets you experiment with your system by temporarily writing disk changes to the Secure Zone area instead of directly to your system drive.

Try&Decide let me test the effect of an update to Microsoft Office before committing myself to keeping it, and I would probably turn the feature on before making any major software changes. I couldn't detect any system slowdown when I used it, and it's a superb safety net for anyone who experiments with software. If I felt really paranoid about system changes, I could even set Try&Decide to turn itself on automatically every time I booted up. Unfortunately, though, the feature does have some notable limitations. Once you commit your changes to the hard drive, you can't return your system to an earlier state. For that, you'd need to restore from a backup image, or use a dedicated drive snapshot program like Roxio BackOnTrack 3 Suite.

Also note that while using Try&Decide, you can't save backups using Secure Zone. True Image doesn't warn you about this when a backup tries to run, though; it simply alerts you that the backup failed. Even if you probe the log files, you'll learn only that the program couldn't get exclusive access to the Secure Zone area; you won't be told that you have to turn off Try&Decide. Unless you know what's wrong, you'll be alarmed to see a red banner on the program's main screen shouting that your backups have failed but giving no reason.

If you want, you can also use the Secure Zone area to store a copy of the Linux-based True Image version so it's available whenever you boot up, even if you don't have a copy of the emergency CD. When you do this, your system prompts you at boot-up to press F11 if you want to launch True Image or simply wait a few seconds to boot normally into Windows. If you turn on this feature, True Image overwrites your hard drive's Master Boot Record (MBR), a change that won't bother most home and business users but may give advanced users pause. Those who have Linux and Windows installed on the same system will have to reinstall the Linux boot loader after True Image changes the MBR. Even if you use only Windows, however, you may instinctively distrust (as I do) any third-party software that makes this kind of low-level change to your drive. It's been years since I've experienced an actual problem with third-party programs that overwrite the MBR, but I feel safer if the MBR remains in the state that Windows or the computer manufacturer left it in.
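Before letting any utility rewrite the MBR, a cautious user can save a copy of it first. The sketch below is my own illustration, not part of True Image: the 512-byte sector size and the 0x55AA boot signature at the end of the sector follow the standard MBR layout, while the raw-device path shown in the comment is a hypothetical Windows example that would require administrator rights.

```python
MBR_SIZE = 512
BOOT_SIGNATURE = b"\x55\xaa"  # last two bytes of a valid MBR

def is_valid_mbr(sector: bytes) -> bool:
    """Check that a sector is 512 bytes and carries the standard
    MBR boot signature in its final two bytes."""
    return len(sector) == MBR_SIZE and sector[-2:] == BOOT_SIGNATURE

def save_mbr(device_path: str, backup_path: str) -> None:
    """Read the first sector of a raw device and save it to a file.
    On Windows, device_path would be something like r'\\.\PhysicalDrive0'
    (hypothetical example; opening it requires administrator rights)."""
    with open(device_path, "rb") as dev:
        sector = dev.read(MBR_SIZE)
    if not is_valid_mbr(sector):
        raise ValueError("first sector does not look like a valid MBR")
    with open(backup_path, "wb") as out:
        out.write(sector)
```

With a saved copy in hand, a damaged MBR can be written back with the same tools, which takes much of the anxiety out of letting a boot-manager feature modify it.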

Paragon Personal has a feature called Backup Capsule that, like Secure Zone, creates a special area on a drive and stores backup images as well as a copy of the program that can be used in an emergency. It has the same potential risks as Secure Zone, but at least it doesn't call itself "secure." The Paragon Personal Backup Capsule doesn't have the equivalent of Try&Decide, though.

I've used True Image's basic backup and restore functions for years without experiencing major problems, but I have mixed feelings about the vast range of features built into the most recent versions. The program used to have the same tight focus on image backup and restore that I like in ShadowProtect Desktop, but Acronis seems to have expanded the feature set without fixing basic problems like the one I encountered with the emergency CD. If you're considering True Image, I suggest you visit the official Acronis support forum at Wilders Security Forums and read some of the problem reports submitted by users.

True Image gets more kinds of backup jobs done than any of its rivals, and if you have a plain-vanilla home or SOHO system, this product will do all you need and more—provided you don't run into the kind of compatibility issues I experienced, and which also seem to be reported often on the company's own support forums. Many difficulties that users report on the forums result from simple mistakes in reading the menus, but a number of others involve problems restoring from backups, and Acronis isn't as quick as ShadowProtect in responding to or resolving these reported problems. This is a high-quality, mature program that's surprisingly flexible, but Acronis won't be ready to challenge ShadowProtect for our Editors' Choice until it focuses more on its core functions and less on the size of its feature set.

DriveImage XML

DriveImage XML may have limited features, but it's solid drive-imaging software, and it's free.

Free. Command-line interface allows running program from batch files.

No incremental backups. Can't write to DVD. You must build your own emergency boot CD.

Runtime Software

Price: $0.00 Direct
Type: Personal
Free: Yes
OS Compatibility: Windows Vista, Windows XP

DriveImage XML does a lot less than other drive-imaging programs, but it costs nothing, and it does just as good a job as they do of the essential tasks of backing up, browsing, and restoring drive images. It does these jobs more slowly, however; it can't make incremental backups (the kind that quickly supplement full backups with changes made since the initial backup); and it's suitable only for users who don't mind building their own emergency boot CD. If you've already built a BartPE boot disk, DriveImage XML is the kind of imaging software you want. If you've never heard of BartPE and you don't know what I'm talking about, read on.

You've heard that backing up is hard to do? Not with this utility. You simply choose Backup from the main menu, select a local drive to back up (the program won't let you back up network drives), click on Next, and pick a destination (which can be any local or network drive). As with all the other products in this category, you can save the image to the same drive that you're backing up. If you opt to save to the same drive, you'll probably also want to copy the image to a safer location afterward. Though you can't save directly to DVD media, an option will split the image into multiple files suitable for copying by hand to DVDs after the backup is complete. You can't back up individual files, but I prefer whole-drive backups anyway.

DriveImage XML has one geek-level feature I like: A command-line interface lets you control the utility through an old-style batch file to fine-tune backup procedures and delete or rename existing backups. You have to write your own batch files, of course, but for experienced batch-file gurus, that's faster than using the other products' wizards. I'm not thrilled, though, that DriveImage XML won't run under Vista until you turn off User Account Control (UAC). While I dislike UAC as much as you do, it's a valuable security feature, and I'd prefer to leave it on.
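The general pattern for scripting any command-line imaging tool looks something like the sketch below. To be clear, this is not DriveImage XML's real syntax: the executable name and the /backup-style switch are hypothetical placeholders, and you'd substitute the switches documented by whatever tool you actually use.

```python
import datetime
import subprocess

def build_backup_command(tool, source_drive, dest_dir):
    """Build a command line that images source_drive to a date-stamped
    file. The tool name and the /backup switch are hypothetical
    placeholders, not any real product's syntax."""
    stamp = datetime.date.today().strftime("%Y-%m-%d")
    target = f"{dest_dir}\\drive_c_{stamp}.img"
    return [tool, "/backup", source_drive, target]

def run_backup(cmd):
    """Launch the imaging tool and report whether it exited cleanly."""
    result = subprocess.run(cmd)
    return result.returncode == 0
```

Dropping a call like this into a nightly scheduled task gives you exactly the kind of hands-off, date-stamped backup rotation that the wizard-driven products offer through their GUIs.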

How DriveImage XML Works

To extract files from an image, you use the browse feature from the product's main menu, which opens a tree-structured file-viewing program that looks like a ten-year-old version of Windows Explorer. I liked the toolbar that let me extract, launch, and view files. The view feature uses a Notepad-style file viewer from which you can copy text to the Windows clipboard, but I prefer the ability that competing programs give me to browse a backup image in Explorer.

Restoring an image can be tricky with DriveImage XML. You can use the program's main menu to restore partitions to your local hard drive—unless you want to restore to your system partition. In that case you have to boot to an emergency boot CD—and you have to provide that CD for yourself using the freeware, open-source BartPE (Bart's Preinstalled Environment) devised by Bart Lagerweij. Don't even think about trying this if you've never at least tried on a propeller beanie: Although the process isn't especially difficult, the instructions are long and complicated, and you'll need an up-to-date Windows XP installation CD—not the restore disc that came with your mail-order computer, but a full-fledged installation CD that can install XP on any machine.

I don't exactly enjoy building BartPE discs, and I've never spent the time I would need to incorporate all the add-ons preferred by BartPE experts, but I climbed the learning curve a few years ago, and I don't mind taking a few minutes to create a new CD every now and then. All I needed to do to create a BartPE disc that would let me restore a DriveImage XML image to my system was download a DriveImage XML plug-in for BartPE from the DriveImage XML Web site, install the plug-in to my existing BartPE-building environment, and burn a new CD that includes a copy of DriveImage XML. Once I booted from my new BartPE CD, I could click on the BartPE menu, open the copy of DriveImage XML already on the CD, and back up or restore any drive in my system. If reading this last paragraph makes your head hurt, DriveImage XML probably isn't for you. If, on the other hand, it sounds old hat, you should definitely consider it.

DriveImage XML is a long-established and reliable program that's more or less certain to work without fuss or failures. I value my data enough to prefer a program with more options and the ability to make incremental backups, but if I had to use DriveImage XML instead of its competition, I certainly wouldn't complain. But then again, I'm not frightened by BartPE.

Norton Ghost 12.0

This is a flexible, powerful drive-imaging and file-backup program with an exceptionally clear interface and lots of scheduling options, but a networking problem with its emergency CD keeps it from receiving an Editors' Choice.

Simple interface with a unique calendar view of past and scheduled backups. Backs up drives and files with ultra-flexible scheduling options, including backups when a specified application launches.

Emergency disc couldn't see network on test systems.

Symantec Corporation

Price: $69.99 Direct
Type: Business, Personal, Professional
OS Compatibility: Windows Vista, Windows XP
Tech Support: phone and email

This software inherits the name of the original Ghost (the first widely used drive-imaging software released by Binary Research in 1996) but none of the code, so I wouldn't buy it based on its ancestry. On the other hand, I might recommend buying Norton Ghost 12.0 because it has the best interface of any drive-imaging competitor, and it's the only product of its kind that won't frighten a completely nontechnical user. I admire many things about it, especially the capability it gives you to have specific events trigger a backup, but I encountered enough glitches to keep me from preferring it to our Editors' Choice product, ShadowProtect Desktop 3.1.

Like all its commercial rivals, Norton Ghost can create full backups of your drives, supplemented by scheduled incremental backups that include only changes made since the last full backup. What's more, the utility can save backup images on local or network drives or on writable DVDs. Like Acronis True Image 11 Home, it can also back up specific folders or file types, a convenience that could make Norton Ghost and True Image Home vie for first choice among users who want to maintain both full drive backups and smaller file backups. Those are perfectly reasonable options for backing up, though I think up-to-date drive backups are sufficient.

When I loaded Norton Ghost's emergency boot CD, I expected to have as good an experience as I had had with the ShadowProtect Desktop emergency CD, because both products make use of the up-to-date hardware support built into Windows Vista. Unfortunately, the networking setup on the emergency CD never found my network from any of the three attached test systems, so I couldn't restore my system from my Network Attached Storage. None of the emergency CDs of the other products I was testing at the same time had this problem, which, for me, rules out Norton Ghost as a backup solution. Though many users have reported no problems, others have related similar difficulties. If you back up to a network location, you may want to try Norton Ghost before you buy it.

Norton Ghost: What I Liked

One of the things I do like about the utility is its lucid, spacious interface, which makes every option easily accessible without cluttering the screen. Norton Ghost uses the standard left-panel toolbar and large right-panel information window to lay out its features. This is more or less the way its competitors do things, too, but Ghost is especially well organized. Additionally, I was pleasantly surprised in a few areas, such as a calendar view that showed me the dates on which I had already made backups and the dates on which future backups were scheduled.

What impressed me most was an option that starts an incremental backup when specific events occur on the computer, such as a user logging on or off, a particular application being launched, or a certain amount of data being added to the drive. It's similar to a capability in True Image Home, but Norton Ghost gives you many more triggers. This feature is probably most useful in corporate settings with acres of storage space, but other users might want it as an easy way to link a backup set with specific events rather than particular times (which Norton Ghost also lets you do).
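A trigger of this kind is conceptually simple. The sketch below is my own hypothetical illustration, not Ghost's implementation: it fires a callback (which would launch the incremental backup) once the data under a watched folder has grown by a set number of megabytes.

```python
import os

def folder_size(path):
    """Total size in bytes of all files under path."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total

class GrowthTrigger:
    """Fire `callback` when the watched folder has grown by
    `threshold_mb` megabytes since the last firing."""
    def __init__(self, path, threshold_mb, callback):
        self.path = path
        self.threshold = threshold_mb * 1024 * 1024
        self.callback = callback
        self.baseline = folder_size(path)

    def poll(self):
        size = folder_size(self.path)
        if size - self.baseline >= self.threshold:
            self.callback()
            self.baseline = size  # reset so the next firing needs new growth
            return True
        return False
```

A real product would hook file-system events rather than polling folder sizes, and would watch logons and application launches as well, but the reset-the-baseline-after-firing logic is the essence of a "backup after N megabytes added" trigger.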

I also liked the way Norton Ghost implements its other tools. A feature that copies backup images from one location to another let me take an image I had saved to a network drive and make a copy elsewhere. Another option will break up an existing image into smaller-size chunks that can fit onto DVDs—a nice touch.

I also like the way Norton Ghost uses an internal file browser for browsing and extracting files from image backups. With other commercial products, you browse through image backups by opening them as "virtual drives" in Windows Explorer so that each backup that you open gets a drive letter of its own. If you need to browse through more than one backup, it's easy to get confused over which one is currently "Drive D" and which one is "Drive E." With Norton, you use a special-purpose file manager that always displays exactly which backup image you're viewing—but you also get the option of clicking on a toolbar if you want to open the image in Explorer, where you can use all the conveniences available in the right-click menu.

Norton Ghost: What I Didn't Like

What I didn't especially like about Norton Ghost was its slowness in writing backups to DVDs, especially compared with the speedy ShadowProtect. And when I first installed the boxed Norton Ghost CD under Vista, I was taken aback by the number of error messages the program threw at me regarding its inability to "connect" to the system—not very illuminating information for nontechnical users. The messages, and the errors that caused them, disappeared after I ran the Live Update feature to update the program on the CD to the latest version, but it was unsettling to find that the version on the CD had so much trouble with Windows Vista, which had already shipped by the time this version of Ghost was released.

I'm also bothered by the many problem reports circulating on the Internet about failures in the program that Symantec's minimally expert tech support staff wasn't able to solve. I've learned to rely on online forums for tech support because they serve as repositories of users' experiences and real-world solutions, and I was struck by the fact that Symantec doesn't sponsor an online support forum for Norton Ghost. A Web search brought me to a lively but unofficial Ghost-users' mutual-support forum, but the site showed no trace of participation from Symantec. The contrast between Symantec's total lack of an online forum on the one hand, and, on the other hand, the official support forum for ShadowProtect Desktop (with its almost instant responses by the vendor's own experts) was striking.

Symantec is currently working on what I'm told is a minor upgrade, Norton Ghost 14.0 (the company is skipping number 13 to avoid associating a program that's supposed to make you feel secure with an "unlucky" number). Maybe it will fix the networking problem that, for me, ruled out buying this version of the product. Norton Ghost 12.0 has lots of things to like, but if you've been considering it, you may want to wait until we find out what's improved in the next version.

Paragon Drive Backup 8.5 Personal Edition

This is a flexible, advanced drive backup and restore utility. The help file can be opaque, however, and the interface may be daunting to casual users.

Reliable drive-image backups and restores. Easy management of multiple archives for restoring old files. Powerful, reliable Linux-based emergency boot CD.

Confusing help file; interface suitable for experts only.

Paragon Software Group

Price: $49.95 Direct
Type: Business, Personal, Professional

Don't be fooled by "Personal" in the name of this powerful drive-imaging software. Paragon Drive Backup 8.5 Personal Edition looks as if it's designed for personal use by the kind of people who manage enterprise IT systems. You may prefer this package if you're comfortable with menus that let you back up your hard drive's Master Boot Record and aren't terrified by an option to back up the first track, or if you want to manage multiple backup images from a single archive menu. And if you're a techie needing to restore your system from the program's Linux-based emergency boot CD, you'll thrill to the knowledge that you can open a command line and perform advanced surgery on your drives and files using Linux's most arcane tools. Home and SOHO users who don't want to be bothered with technical details, however, should probably choose something else for their personal drive-imaging product. As a fairly experienced user, I could happily live with this product for drive backups, but I'll stick with ShadowProtect Desktop 3.1 (our Editors' Choice) for its superior speed and easy-to-use Vista-based emergency CD.

In its feature set, Paragon Drive Backup Personal falls between the needle-sharp focus of ShadowProtect Desktop and the Swiss-Army-knife approach of Acronis True Image 11 Home. Paragon creates and restores backup images of partitions or whole drives, but it doesn't offer True Image Home's ability to back up individual files or settings. Like the competition, Drive Backup Personal uses a two-pane interface, with a list of tools and tasks in a left-hand toolbar, and on the right a larger window for managing drives and backup images. I was impressed to see that I could back up my system without even installing the utility, simply by inserting the emergency CD while Windows was running and choosing the option to run the software directly from the CD without rebooting. I also liked having the option to shut down my system automatically after completing a backup—an ideal feature for anyone who likes to back up a system at the end of the day.

Unhelpful Help

Like its commercial rivals, Drive Backup Personal uses a wizard for creating backup and restore jobs, but the wizard has a split personality that bothers me a bit. I like that it keeps things simple by hiding encryption and other advanced options unless you check a box marked "Change backup settings" on the first menu.

On the other hand, users who want simple choices will be shocked when they get to the scary-looking menu where they must choose whether to back up individual partitions or the entire drive. This menu's tree-structured view of the drive includes options to back up the first track and Master Boot Record. You can ignore those choices, but non-techies could use a bit of explanatory text—or at least something in a help file.

Unfortunately, you can't access the program's help while using the backup wizard. And even when you return from the wizard to the main tabbed interface, the help system remains hard to use because it's on a tab of its own, so to read it you have to switch away from the menu with which you need help.

Worst of all, though the text looks a bit like English, it certainly doesn't read that way. This sentence is fairly typical: "To synthesize a new property modified archive based on the existed backup images of the selected disk/partition with the Synthetic Backup Wizard, simply do the following." (Don't you love that "simply"?) I think this is about merging (synthesizing) two backup images made from the same drive using different settings, but the whole Synthetic Backup feature isn't even included in the Personal edition, so the help screens about it merely added to the confusion, rather than helping to resolve it.

Paragon in Action

The product backed up my test system more slowly than the commercial competitors did, and it created an image that was about 20 percent larger than either of those created by True Image Home and ShadowProtect Desktop. Drive Backup Personal supports a feature called Backup Capsule, which, like the True Image Home Secure Zone, carves out a hidden partition on your hard drive and uses it for storing backups. I don't much care for this capability: I want my backups stored on a separate drive in case my main drive becomes physically damaged or unreadable. If you don't have another drive, however, the Backup Capsule is safer than nothing.

Drive Backup Personal uses the Backup Capsule area for backups only; it doesn't match the Try&Decide feature in True Image Home, which lets you experiment with system changes before committing them to your system. As with True Image Home, Drive Backup Personal includes an option that stores a reduced copy of itself on the Backup Capsule, allowing you to boot into the reduced program and restore a damaged Windows system to a previous version. This boot-into-the-restore-program feature overwrites the Master Boot Record (MBR) of your hard drive, and I have reservations about that. It's been years since I've experienced an actual problem with third-party programs that overwrite the MBR, but I feel safer if the MBR remains in the state that Windows or the computer manufacturer left it in.

You can make two different types of emergency boot discs with Drive Backup Personal: a simple MS-DOS-based one that you can burn by choosing an option in the Drive Backup program, and a full-fledged Linux-based emergency disc that you can download as a disc image from the Paragon Web site and then burn to a CD. The DOS-based CD launches a program that performs basic restore and disk-management options on your internal hard drive but can't access backups on external drives or a network. This is the same program that gets stored on the Backup Capsule for emergency use. The DOS program listed a nonexistent drive B: on my test system as a possible source of archives, and when I selected that drive by mistake, the program locked up.

I was a lot more impressed with the full-fledged Linux-based disc, which managed external drives without getting confused by multiple partitions on the internal drive the way the True Image Home Linux-based disc did. You'll need some experience with networking and an understanding of basic Linux to use all the features on the full-fledged Paragon emergency CD, and some of the disc's menus—especially in its networking setup—aren't for the fainthearted.

I was very impressed with Paragon Drive Backup 8.5 Personal Edition, and I think any system administrator would be glad to have a copy of its Linux-based emergency CD. While it may be a bit intimidating for less-savvy users, this highly reliable and powerful utility is a good choice for those who need advanced capabilities. It's a close second to our Editors' Choice, the more broadly appealing ShadowProtect Desktop.

ShadowProtect Desktop 3.1

This software provides the fastest and smoothest backups and restores of any drive-image utility on the market, and a Vista-based emergency disk guarantees compatibility with the widest range of backup hardware. ShadowProtect Desktop 3.1 is the best such product and worth ten times its price in terms of peace of mind and flexibility.

Fast, reliable image backups to local and network drives. Easy restores, even to different hardware. Plentiful scheduling and security options. Can mount images in VMware Workstation or Microsoft Virtual PC.

Can't back up specific groups of files.

StorageCraft Technology Corporation

Price: $79.00 Direct
Type: Business, Personal, Professional
OS Compatibility: Windows Vista, Windows XP
Tech Support: email and public forum

What I want from backup software is simple: It should work perfectly and fast—and that's exactly what ShadowProtect Desktop 3.1 did on my tests. Maybe the reason this utility is so good at backing up hard drives and restoring whole systems, individual drives, and individual files is that it doesn't try to do anything else. The utility lacks some of the fancy features trumpeted by its commercial competitors, like Acronis True Image 11 Home and Paragon Drive Backup 8.5 Personal Edition, but I prefer it over all the alternatives, because it does what it's designed to do with unparalleled speed and reliability. And its interface, while not as spectacular as its performance, is clear enough to get the job done.

ShadowProtect Desktop creates full and incremental backup images, either on schedule or on demand, and can save them to internal and external hard disks, CDs, DVDs, Blu-ray media, network drives, and Network Attached Storage (NAS) units. It restores images to your hard drive—or to a different machine—when you need to revive a nonworking system or simply want to go back to an earlier version of a mildly messed-up one.

ShadowProtect Desktop performs the same basic functions as True Image Home and Paragon Drive Backup, but it's breathtakingly fast compared with its rivals. It's also the only program of its kind that creates writable backup images. In Windows Explorer I was able to open a backed-up drive image, modify files, run a virus remover on them, or use any other software to manipulate them—and then save the image in its modified form. Rival products, by contrast, can open backed-up images strictly as read-only drives, so you can copy files out but can't change anything inside. Best of all, ShadowProtect Desktop simply works without surprises no matter what hardware I use.

Interfacing with ShadowProtect

The interface is standard for this software category—a sidebar on the left for choosing basic tasks and a large panel on the right for viewing backup operations in progress, studying a map of your drives, and giving access to other features. This product isn't for complete beginners, but you don't need to be an expert, either. Setting up a scheduled automated backup routine takes only a few seconds. For my hard drive, I used the wizard interface to schedule monthly image backups and daily incremental backups that saved to a D-Link DNS-323 NAS unit. Every few weeks I also make full backups to a small stack of writable DVDs and a USB-attached external drive.

ShadowProtect Desktop ships on a CD that can install the utility in Windows. The same disc can also work as a bootable emergency CD when I need to restore a drive from a backed-up image, retrieve files from a stored image, or simply retrieve files from a machine that won't boot into Windows. The software uses a reduced version of Vista (the Vista Pre-boot Environment, or Vista PE) for its bootable disc, which means that the emergency CD can access every kind of drive Vista can. This makes for much smoother operations than the Linux-based emergency CD used by True Image Home, which was confused by my complex system. Also, when backing up and restoring a Windows-based system, I'm simply more comfortable with an emergency CD based on Vista rather than Linux—even Paragon Drive Backup's excellent Linux-based emergency disc.

Unlike True Image Home, ShadowProtect Desktop doesn't come with software that creates a bootable emergency CD for you if you don't have the original, but registered users can download a burnable ISO image from StorageCraft's Web site and create a bootable CD at any time. Nor will you find a feature like "Secure Zone" or "Backup Capsule," offered by True Image Home and Paragon Drive Backup, that stores a backup on the drive being backed up. But such zones won't protect your data if your drive suffers a physical failure. ShadowProtect Desktop will let you save images from one partition to another partition on the same physical drive, which is handy. But the app is really designed for storing backup images on separate media that you can keep somewhere safe. After all, if your machine gets nuked, you'll probably lose both partitions. Still, if you merely overwrite a key file by accident, having a local backup is useful, too.

Hardware Independent Restoration: Very Cool

The software's most impressive unique feature is its Hardware Independent Restore (HIR), which runs only from the emergency CD. With HIR, you can take an image made on one computer, restore it to another that has entirely different hardware, and be confident that the restored image will boot. You'll probably need to install sound and network drivers for the new system, but you'll be able to boot—a feat you'll rarely manage when you transfer an existing system to a very different machine. Also, you'll be able to use all your existing software and settings without going to the trouble of installing all your programs again.

This feature takes most of the headaches out of upgrading to a new computer. I tested the capability by restoring my normal system to a VMware Workstation virtual machine, and the process was amazingly smooth. It took only 20 minutes to restore a copy of my system and probably would have taken even less time to perform the same trick with real hardware.

I was pleased and surprised to find that with the two most widely used virtual-computer programs, VMware Workstation and Microsoft's freely downloadable Virtual PC 2007, I didn't even need a copy of ShadowProtect to boot a drive image—provided, of course, that the image was of a bootable drive. I could simply select the drive image as a new machine in VMware Workstation or Virtual PC, wait a few moments, and the image would boot up as a virtual machine. It sounds complicated, but it required only a few mouse clicks.

To try out this feature, I opened VMware Workstation, clicked on "Open an Existing VM," and selected a ShadowProtect image of a bootable drive from the backups stored on my network-attached storage device. VMware took less than a minute to create all the additional files needed to launch the image as a virtual machine, displayed a few unimportant warning messages, and booted the image file.

Other features I like in ShadowProtect Desktop include its Image Management tool, which can split a large image into smaller files for easier transfer, and can combine a full backup and a sequence of incremental backups into a single file that reflects the state of my system as of the last incremental backup. This kind of fine-tuning isn't available in rival products.
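Both Image Management operations can be sketched in Python. This is a conceptual illustration under the assumption of simple block-indexed images—the real tool operates on StorageCraft's own file format, which is not shown here.

```python
# Hypothetical sketch of the two Image Management operations:
# splitting an image file into chunks, and consolidating an
# incremental chain into a single image.

def split(image_bytes, chunk_size):
    """Split a raw image into fixed-size chunks for easier transfer."""
    return [image_bytes[i:i + chunk_size]
            for i in range(0, len(image_bytes), chunk_size)]

def consolidate(full, incrementals):
    """Merge a full image with its incremental chain (oldest first)
    into one image matching the state after the last incremental."""
    merged = dict(full)
    for inc in incrementals:
        merged.update(inc)
    return merged

full = {0: "boot", 1: "os", 2: "data"}
day1 = {2: "data-v2"}
day2 = {1: "os-patched", 3: "new-file"}
assert consolidate(full, [day1, day2]) == {
    0: "boot", 1: "os-patched", 2: "data-v2", 3: "new-file"}
assert split(b"abcdef", 4) == [b"abcd", b"ef"]
```

Consolidation matters in practice because a long chain of daily incrementals is slow to replay and fragile to store; collapsing it into one file trades a little disk space for a much simpler restore.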

The product isn't absolutely perfect—for example, you can't carry out every task from the keyboard: I had to use the mouse to click on the filename the software suggested for a backup image before the program would let me rename it. Also, the utility tends to be conservative in estimating the time needed to complete a backup: It told me that a backup to DVD would take 2 hours and then completed the task in 15 minutes. But those were the worst faults I could find.

What matters to me most is that the software is fast, reliable, and—for even moderately experienced users—almost effortless. The support forum at StorageCraft's Web site reports impressively few problems, and a StorageCraft engineer always responds within a few hours with either a way to fix a problem or with a promise to get it fixed in the next update. ShadowProtect makes me feel more secure about my system than I ever did before. I wouldn't run my computers without it, and I think you shouldn't either.


Copyright (c) 2008 Ziff Davis Media Inc. All Rights Reserved.
