Sunday, March 30, 2008

Google has lots to do with intelligence


When the nation's intelligence agencies wanted a computer network to better share information about everything from al Qaeda to North Korea, they turned to a big name in the technology industry to supply some of the equipment: Google Inc.

The Mountain View company sold the agencies servers for searching documents, marking a small victory for the company and its little-known effort to do business with the government.

"We are a very small group, and even a lot of people in the federal government don't know that we exist," said Mike Bradshaw, who leads Google's federal government sales team and its 18 employees.

The strategy is part of a broader plan at Google to expand beyond its consumer roots. Federal, state and local agencies, along with corporations and schools, are increasingly seen by the company as lucrative sources of extra revenue.

In addition to the intelligence agencies, Google's government customers include the National Oceanic and Atmospheric Administration, the U.S. Coast Guard, the National Highway Traffic Safety Administration, the state of Alabama and Washington, D.C.

Many of the contracts are for search appliances - servers for storing and searching internal documents. Agencies can use the devices to create their own mini-Googles on intranets made up entirely of government data.

Additionally, Google has had success licensing a souped-up version of its aerial mapping service, Google Earth. Agencies can use it to plot scientific data and chart the U.S. coastline, for example, giving ships another tool to navigate safely.

Spy agencies are using Google equipment as the backbone of Intellipedia, a network aimed at helping agents share intelligence. Rather than hoarding information, spies and analysts are being encouraged to post what they learn on a secure online forum where colleagues can read it and add comments.

"Each analyst, for lack of a better term, has a shoe box with their knowledge," said Sean Dennehy, chief of Intellipedia development for the CIA. "They maintained it in a shared drive or a Word document, but we're encouraging them to move those platforms so that everyone can benefit."

Like Wikipedia

The system is modeled after Wikipedia, the public online, group-edited encyclopedia. However, the cloak-and-dagger version is maintained by the director of national intelligence and is accessible only to the CIA, FBI, National Security Agency and an alphabet soup of other intelligence agencies and offices.

Agents can log in, depending on their clearance, to Intellipedia's three tiers of service: top secret, secret and sensitive but unclassified. So far, 37,000 users have established accounts on the network, which contains 35,000 articles encompassing 200,000 pages, according to Dennehy.

Google supplies the computer servers that support the network, as well as the search software that allows users to sift through messages and data.

Dennehy declined to assess the quality of Google's products, but he applauded the contribution that Intellipedia can make to the government's work. Whether the network actually leads to better intelligence will largely depend on agents sharing some of their most important files and on colleagues chiming in with incisive commentary - issues that are out of Google's hands.

Normally, Google ranks results on its consumer site by using the number of links to a Web page as a barometer of its importance. Doing so on Intellipedia isn't as effective because the service lies behind a firewall and is used by a limited number of people.

Instead, material gets more prominent placement if the network's users tag it with descriptive keywords.
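The contrast between the two ranking signals can be sketched in a few lines of Python. This is purely illustrative: the page fields, the tag scheme and the scoring are assumptions for the example, not Google's actual code.

```python
# Illustrative sketch: link-count ranking vs. tag-overlap ranking.

def rank_by_links(pages):
    """Public-web style: more inbound links means higher placement."""
    return sorted(pages, key=lambda p: p["inbound_links"], reverse=True)

def rank_by_tags(pages, query):
    """Intranet style: more user-applied tags matching the query wins."""
    terms = set(query.lower().split())
    def tag_matches(page):
        return len(terms & {t.lower() for t in page["tags"]})
    return sorted(pages, key=tag_matches, reverse=True)

pages = [
    {"title": "A", "inbound_links": 90, "tags": ["budget"]},
    {"title": "B", "inbound_links": 2,  "tags": ["korea", "missile", "analysis"]},
]

# Behind a firewall, link counts are sparse, so tag overlap is the
# more useful signal: page B outranks page A for this query.
print([p["title"] for p in rank_by_tags(pages, "korea analysis")])
```

On the public web, page A's 90 inbound links would put it first; on a small intranet, the user-applied tags promote page B instead.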

Because of the complexities of doing business with the government, Google uses resellers to process orders on its behalf, while the company itself handles the sales, marketing and management of the accounts.

Conspiracy theories

Google is one of many technology vendors vying for government contracts.

A single deal can be sizable, such as the one Google made with the National Security Agency, which paid more than $2 million for four search appliances plus a support agreement, according to a contract obtained through a Freedom of Information Act request.

However, the amount is insignificant when measured against Google's overall revenue of $16.6 billion last year, virtually all of which came from online advertising.

On occasion, Google is the target of conspiracy theories from bloggers who say it is working with spy agencies more closely than simply selling search equipment.

The buzz got so loud two years ago that Matt Cutts, who leads Google's fight against spam Web sites, responded by ridiculing the idea in his personal blog.

Google's Bradshaw emphasized that the company sells virtually the same products to companies as it does to government agencies. Google can make minor tweaks to comply with government rules about equipment security, for example, while major customization is handled by others.

"There were some wild accusations," Bradshaw said. "But everything we do with the government is the same as what we do with our corporate customers."

E-mail Verne Kopytoff at

This article appeared on page C - 1 of the San Francisco Chronicle.

It's Too Darn Hot

From BusinessWeek, March 30, 2008.

The huge cost of powering—and cooling—data centers has the tech industry scrambling for energy efficiency

A 35-minute drive south of Iceland's capital of Reykjavik lies the tiny fishing village of Grindavik. One January day, Kristinn Haflioason steers his car a few minutes out of town to a vast, snow-swept expanse of volcanic rock that juts out into the Atlantic Ocean. He climbs out and launches into an unlikely sales pitch that he hopes will persuade corporations from the U.S. and Europe to locate operations there. "Dozens of companies have expressed interest," he says.

This is no joke. Haflioason works for Invest in Iceland, a government agency. He's pitching the desolate spot outside of Grindavik as a site for data centers, the sprawling facilities chock-full of computers that tech companies build to handle the swelling oceans of digital information.

It's a testament to the challenges companies face in operating data centers that Google (GOOG), Yahoo! (YHOO), and Microsoft (MSFT) have all checked out this remote corner of the world (although none has made a commitment so far). The reason: Iceland has a rare combination of vacant land, cheap geothermal energy, and chilly climate that makes cooling a data center nearly free.

The tech industry is facing an energy crisis. The cost of power consumption by data centers doubled between 2000 and 2006, to $4.5 billion, and could double again by 2011, according to the U.S. government. With energy prices spiking, the challenge of powering and cooling these SUVs of the tech world has become a major issue for corporations and utilities. "The digital economy is pervasive," says Andy Karsner, Assistant U.S. Energy Secretary for energy efficiency. "The demands for computing will grow exponentially, but electric consumption can't grow the same way."

The race is on to come up with creative solutions. Companies are scouring the globe for new technologies and advantageous locations. Iceland may have the ideal climate; Saudi Arabia may offer the lowest energy costs. Every company in the business is looking to squeeze expenses in hopes of becoming the low-cost producer in the Digital Age.

Where will the breakthroughs come from? Utilities, construction companies, and tech outfits all are working on the issue. Bruno Michel, a researcher at IBM's (IBM) Zurich lab, is developing ways that the biology of the human body can be translated into cooling systems for computers. Mark Bramfitt, principal program manager at Pacific Gas & Electric (PCG), is experimenting with incentives to curb overall energy use. Just a few watts per computer can add up. An efficient data center uses about 25% less electricity than a run-of-the-mill one. In a midsize facility, that could amount to $4.5 million a year in savings.


The modern data center is like a vast refrigerator with hundreds or thousands of ovens blazing away inside. Six-foot-tall metal racks stacked with pizza box-size computers, storage devices, and network-routing machines are lined up in rows. Chilled air blows through the equipment from vents in the floors of "cold aisles." Hot air blows out of the back ends into "hot aisles" and is drawn off and vented out of the building. Inside the centers, there's a dull roar as large quantities of air shoot through ducts, vents, and computers.

So intense is the competition among tech companies to lower their costs of processing data that some treat information about their energy use like state secrets. When Google built a data center along the Columbia River in Oregon a few years ago, it bought the land through a third party so its involvement was hidden, and the city manager had to sign a confidentiality agreement. In North Carolina, the state's sunshine laws forced it to disclose the incentive package it offered Google to locate a data center there, but the company's plans for power consumption were redacted as trade secrets. Little wonder, perhaps: In the future, the competition between Google and Microsoft in the Web search business might be determined as much by data center energy efficiency as by which company writes the best search algorithm.

Think of the data center as the factory for the info economy. Every day, in thousands of nondescript buildings in the hinterlands or in the basements of office towers, trillions of chunks of information coded in ones and zeros are moved, stored, and assembled into new shapes. Those electronic packets include millions of e-mails, Facebook pages, blog entries, and YouTube (GOOG) videos; vast quantities of electronic transactions; plus an ever-expanding universe of data that needs to be stored, sliced, diced, and analyzed. Tech market researcher IDC (IDC) estimates that, each year, the digital world creates about 3 million times the information contained in all the books ever written.

People don't typically think of information as having substance, but the essence of computing is physics. When bits of data in the form of tiny electrical charges move across the wires of semiconductor chips, with millions of transistors switching on and off, they meet resistance. Overcoming resistance requires energy and creates heat. (If they're run too fast, chips won't burst into flames, but they will crack like an overheated automobile engine block.) Move enough bits around, and you have one huge bill for powering and cooling the equipment: The $4.5 billion spent in the U.S. in 2006 is the equivalent of the electric bills for 5.8 million U.S. households.

Energy consumption in data centers emerged as a major issue in the past couple of years. According to a January survey of its members by AFCOM, an association of data center managers, 88.5% of them described energy consumption as a serious or very serious concern, up from 36.1% five years ago. "You're realizing the cost of all that power. You look for opportunities to reduce costs, but now the cost of power is soaring," says Sal Cefalu, a vice-president for data center operations at telecom giant Verizon (VZ), which has 18 data centers nationwide. These days, for every $1 spent on computing equipment in data centers, an additional 50 cents is spent each year to power and cool them. About half of the energy is for air conditioning.


Three years ago, Microsoft began to tackle the energy issue in earnest. CEO Steve Ballmer and other top executives had decided to greatly expand the services offered over the Web to consumers and businesses, including e-mail and instant messaging, to parry threats from Google. So a group of about 20 people gathered at an off-site meeting at the Salish Lodge in Redmond, Wash., with a view of a waterfall out the window, to hash out a forecast of data center needs. "We decided to take the most aggressive, craziest plan we could come up with," says Michael Manos, senior director of data center services. "In hindsight, we were conservative."


Since then, Microsoft has spent more than $2 billion building four gigantic facilities—in Chicago, Dublin, San Antonio, and Quincy, Wash. Now it's looking at locations overseas, including Iceland and Siberia. Its computing needs have doubled each year for the past three years, and Microsoft expects them to continue doubling each year for the foreseeable future.

Energy consumption is a major factor in Microsoft's planning. The company has created what it calls a "heat map" of the globe that takes into account 35 factors in site selection, including cost and availability of power. A group of scientists studies the latest advances in data center design and computing efficiency. And Manos looks for local resources he can take advantage of—such as water from a waste-treatment system that cools computers at the new data center in San Antonio. As a result of all of this effort, Manos believes his 20 data centers are 30% to 50% more efficient than the industry average.

Solutions to tech's energy crisis won't come easily. The largest data centers can cost more than $200 million to construct and are expensive to upgrade to the latest energy-saving technology.

There are a number of relatively quick fixes, however. Most server computers use only about 10% of their capacity at any given time, so data center operators are using software that shifts computing jobs between servers to make the most of their capabilities. New power management systems shut off servers and other equipment automatically when not in use. And most computer companies have caught the green religion. They claim their newest equipment is ultra-energy-efficient.

The problem is that widely accepted yardsticks don't exist for comparing one server computer with another to show which is greener. For servers or data centers, there's no equivalent of the miles-per-gallon ratings for cars. The Environmental Protection Agency and Energy Dept. are working with industry groups to set up benchmarks.

Although more progress is needed, there have been some major advances in data center energy conservation. In each case, the breakthroughs were the result of scientists and others questioning the conventional wisdom in their industries.

Take the design of the chips themselves. A decade ago the chip industry had a single focus: making chips process bits and bytes ever faster. But Marc Tremblay, a top scientist at Sun Microsystems (JAVA), saw a fatal flaw in that strategy. The faster chips ran, the hotter they got, and eventually they'd be too hot to work properly. So he designed a so-called multicore chip, which has several processors on a single sliver of silicon. Each core runs slower and cooler than a processor that does the same amount of work all by itself.


Because of the complexities of semiconductor technology, it took nearly a decade for Tremblay's innovation to come to market. Since Sun's server computers based on Tremblay's designs went on sale two years ago, they have helped turn the company around and have dramatically slashed energy consumption. Each chip in those servers consumes just 70 watts of power, about one-third that of a conventional microprocessor.

Even the traditionally slow-moving utility industry is showing signs of surprising change. Take PG&E. Two years ago, after Bramfitt was handed the job of designing conservation incentives for Northern California's tech companies, he came up with a creative solution. He decided to provide financial incentives to companies that decide to use what's known as virtualization software. Without the software, each server computer typically handles only one program at a time, and only a fraction of the capacity is used. With virtualization software, many programs can be run on a single computer.

The result is that managers can pack more programs onto each server, thus burning less electricity and reducing the number of computers needed. Under Bramfitt's incentive program, PG&E pays customers for every kilowatt-hour of energy they save using the software.
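The consolidation arithmetic behind such an incentive can be sketched as a quick back-of-envelope calculation. All the figures here (per-server wattage, the post-virtualization target load) are illustrative assumptions, not PG&E's actual rebate terms; only the 10% typical utilization comes from the article.

```python
import math

# Illustrative assumptions for the consolidation math.
SERVER_WATTS = 400          # assumed draw of one physical server
UTILIZATION = 0.10          # typical fraction of capacity in use (per the article)
TARGET_UTILIZATION = 0.60   # assumed safe load on each virtualized host

HOURS_PER_YEAR = 8760

def consolidated_count(n_servers):
    """How many hosts remain once workloads are packed together."""
    return math.ceil(n_servers * UTILIZATION / TARGET_UTILIZATION)

def kwh_saved_per_year(n_servers):
    """Energy no longer drawn by the servers that were switched off."""
    remaining = consolidated_count(n_servers)
    watts_saved = (n_servers - remaining) * SERVER_WATTS
    return watts_saved * HOURS_PER_YEAR / 1000  # watt-hours -> kWh

n = 600
print(consolidated_count(n))        # 600 servers at 10% load fit on 100 hosts at 60%
print(kwh_saved_per_year(n))        # kWh/year that a per-kWh rebate would pay on
```

Under these assumptions, retiring 500 of 600 lightly loaded servers saves on the order of 1.75 million kWh a year, which is the quantity a per-kilowatt-hour incentive would be paid against.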

Bramfitt's program is a business model innovation, but most of the data center energy savings are likely to be found in technology advances. IBM researcher Michel is focusing on a seldom-trod territory—where biology and physics meet. Michel, who has a PhD in biochemistry from the University of Zurich, is designing devices that he expects will one day cool chips with a system modeled after the human body. While the processors in server computers are typically cooled with air, these chips are chilled with a liquid delivered through a system similar to the body's capillaries. One of Michel's inventions is a metal cap that fits over a processor and sprays jets of water out of some 50,000 nozzles into microscopic channels etched in the metal. The channels circulate the liquid efficiently and cut the amount of energy required to pump the water.

While there have been substantial improvements, more innovations will be necessary in fields from technology to utilities. Putting data centers in Iceland or Siberia may help, but that on its own won't be enough to solve tech's energy crisis.

With Kerry Capell in Grindavik, Iceland

Monday, March 24, 2008

An extract from Michio Kaku on the science behind UFOs and time travel

In 1600, the former Dominican monk and philosopher Giordano Bruno was burnt alive in the streets of Rome. To humiliate him, the Church first hung him upside down and stripped him naked. What made the teachings of Bruno so dangerous? He had asked a simple question: is there life in outer space? Rather than entertain the possibility of billions of saints, popes, churches, and Jesus Christs in outer space, it was more convenient for the Church simply to burn him.

For 400 years the memory of Bruno has haunted the historians of science. But Bruno has his revenge every few weeks: about twice a month a new extrasolar planet is discovered orbiting a star; more than 250 such planets have now been documented. Bruno’s prediction of extrasolar planets has been vindicated. But one question lingers. Although the Milky Way may be teeming with extrasolar planets, how many of them can support life? And if intelligent life does exist, what can science say about it?

Some people claim that extraterrestrials have already visited Earth in the form of UFOs. Scientists usually dismiss the possibility of UFOs because the distances between stars are so vast. But last year the French government released a report by the French National Centre for Space Studies, which included 1,600 UFO sightings spanning 50 years, including 100,000 pages of eyewitness accounts, films and audiotapes. The French government stated that nine per cent of these sightings could be fully explained, that 33 per cent had likely explanations, but that it was unable to follow up on the rest.

The most credible cases of UFOs involve a) multiple sightings by independent, credible eyewitnesses and b) evidence from multiple sources, such as eyesight and radar. For example, in 1986 there was a sighting of a UFO by JAL flight 1628 over Alaska, which was investigated by the Federal Aviation Administration. The UFO was seen by the passengers of the JAL flight and was also tracked by ground radar. Similarly, there were mass radar sightings of black triangles over Belgium in 1989-90 that were tracked by Nato radar and jet interceptors. In 1976, there was a sighting over Tehran that resulted in multiple systems failures in an F-4 jet interceptor. But what is frustrating to scientists is that, of the thousands of recorded sightings, none has produced hard physical evidence that can lead to reproducible results in the laboratory. No alien DNA, alien computer chip or physical evidence of a landing has ever been retrieved.

We might ask ourselves what kind of spacecraft they would be. Here are some of the characteristics that have been recorded by observers.

a) They are known to zig-zag in midair;

b) They have been known to stop car ignitions and disrupt electrical power;

c) They hover silently.

None of these characteristics fits the description of the rockets we have developed on Earth. For example, all known rockets depend on Newton’s third law of motion (for every action, there is an equal and opposite reaction); yet the UFOs cited do not seem to have any exhaust. And the g-forces created by zig-zagging flying saucers would exceed 100 times the gravitational force on Earth - the g-forces would be enough to flatten any creature on Earth.

Can such UFO characteristics be explained using modern science? In movies it is always assumed that alien beings pilot these craft. More likely, however, if such craft exist, they are unmanned (or are manned by a being that is part organic and part mechanical). This would explain how the craft could execute patterns generating g-forces that would normally crush a living being.

Any alien civilisation advanced enough to send starships throughout the universe has certainly mastered nanotechnology. This would mean that their starships do not have to be very large; they could be sent by the millions to explore inhabited planets. Desolate moons would perhaps be the best bases for such nanoships. If so, then perhaps our own moon has been visited in the past by a civilisation similar to the scenario depicted in the movie 2001: A Space Odyssey, which is perhaps the most realistic depiction of an encounter with an extraterrestrial civilisation.

Some scientists have scoffed at UFOs because they don’t fit any of the gigantic propulsion designs being considered by engineers today, such as ramjet fusion engines, huge laser-powered sails and nuclear pulsed engines, which might be miles across. But UFOs can be as small as a jet aeroplane, and can refuel from a nearby moon base. So sightings may correspond to unmanned reconnaissance ships.

Time is one of the great mysteries of the universe. We are all swept up in the river of time against our will. Around AD400, Saint Augustine wrote extensively about the paradoxical nature of time: ‘How can the past and future be, when the past no longer is, and the future is not yet? As for the present, if it were always present and never moved on to become the past, it would not be time, but eternity.’ If we take Saint Augustine’s logic further, we see that time is not possible, since the past is gone, the future does not exist, and the present exists only for an instant.

In 1990, Stephen Hawking read papers of his colleagues proposing their version of a time machine, and he was sceptical. His intuition told him that time travel was not possible because there were no tourists from the future. If time travel were as common as taking a Sunday picnic in the park, then time travellers from the future should be pestering us with their cameras. There ought to be a law, he proclaimed, making time travel impossible. He proposed a ‘Chronology Protection Conjecture’ to ban time travel from the laws of physics in order to ‘make history safe for historians’.

The embarrassing thing, however, was that no matter how hard physicists tried, they could not find a law to prevent time travel. Apparently, time travel seems to be consistent with the known laws of physics. Unable to find any physical law that makes time travel impossible, Hawking recently changed his mind. He made headlines when he said, ‘Time travel may be possible, but it is not practical.’

Time travel to the future is possible and has been experimentally verified millions of times. If an astronaut were to travel near the speed of light, it might take him, say, one minute to reach the nearest stars. Four years would have elapsed on Earth, but for him only one minute would have passed, because time would have slowed down inside the rocket ship. Hence he would have travelled four years into the future, as experienced here on Earth. (Our astronauts actually take a short trip into the future every time they go into outer space. As they travel at 18,000 miles per hour above the Earth, their clocks beat a tiny bit slower than clocks on Earth. The world record for travelling into the future is held by the Russian cosmonaut Sergei Avdeyev, who orbited for 748 days and was hence hurled 0.02 seconds into the future.) So a time machine that can take us into the future is consistent with Einstein’s special theory of relativity. But what about going backwards in time?
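The cosmonaut figure above can be checked with special relativity. Because orbital speed is a tiny fraction of the speed of light, the first-order approximation Δt ≈ t·v²/(2c²) suffices; gravitational time dilation, which partly offsets the effect at orbital altitude, is ignored in this rough sketch.

```python
# Back-of-envelope check of the time-dilation figures quoted above.

C = 299_792_458.0        # speed of light, m/s
MPH_TO_MS = 0.44704      # miles per hour -> metres per second

def seconds_gained(speed_ms, duration_s):
    """Time by which the traveller's clock lags a clock at rest,
    using the low-speed approximation t * v^2 / (2 c^2)."""
    return duration_s * speed_ms**2 / (2 * C**2)

v = 18_000 * MPH_TO_MS   # the orbital speed quoted in the article
t = 748 * 86_400         # Avdeyev's 748 days, in seconds

print(round(seconds_gained(v, t), 3))
```

The result is about 0.023 seconds, consistent with the 0.02-second "world record" the extract quotes.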

If we could journey back into the past, history would be impossible to write. As soon as a historian recorded the history of the past, someone could go back into the past and rewrite it. Not only would time machines put historians out of business, but they would enable us to alter the course of time at will. If, for example, we were to go back to the era of the dinosaurs and accidentally step on a mammal that happened to be our ancestor, perhaps we would accidentally wipe out the entire human race. History would become an unending, madcap Monty Python episode, as tourists from the future trampled over historic events while trying to get the best camera angle.

But perhaps the thorniest problems are the logical paradoxes raised by time travel. For example, what happens if we kill our parents before we are born? This is a logical impossibility. It is sometimes called the ‘grandfather paradox’.

There are three ways to resolve these paradoxes. First, perhaps you simply repeat past history when you go back in time, therefore fulfilling the past. In this case, you have no free will. You are forced to complete the past as it was written. Thus, if you go back into the past to give the secret of time travel to your younger self, then it was meant to happen that way. The secret of time travel came from the future. It was destiny. (But this does not tell us where the original idea came from.)

Second, you have free will, so you can change the past, but within limits. Your free will is not allowed to create a time paradox. Whenever you try to kill your parents before you are born, a mysterious force prevents you from pulling the trigger. This position has been advocated by the Russian physicist Igor Novikov. He argues that there is a law preventing us from walking on the ceiling, although we might want to. Hence, there might be a law preventing us from killing our parents before we are born.

Third, the universe splits into two. On one timeline the people whom you killed look just like your parents, but they are different, because you are now in a parallel universe. This latter possibility seems to be the one consistent with the quantum theory.

The film Back to the Future explored the third possibility. Doc Emmett Brown (Christopher Lloyd) invents a plutonium-fired DeLorean car, which is actually a time machine for travelling to the past. Marty McFly (Michael J. Fox) enters the machine and goes back and meets his teenage mother, who then falls in love with him. This poses a sticky problem. If Marty’s teenage mother spurns his future father, then they never would have married, and he would never have been born.

The problem is clarified a bit by Doc Brown. He goes to the blackboard and draws a horizontal line, representing the timeline of our universe. Then he draws a second line, which branches off the first line, representing a parallel universe that opens up when you change the past. Thus, whenever we go back into the river of time, the river forks into two, and one timeline becomes two timelines, or what is called the ‘many worlds’ approach.

This means that all time-travel paradoxes can be solved. If you have killed your parents before you were born, it simply means you have killed some people who are genetically identical to your parents, with the same memories and personalities, but they are not your true parents.

Tips on hiring direct from Dell


At the age of 15, Michael Dell took apart a brand-new Apple II computer and rebuilt it, just to see if he could.

When the young computer enthusiast was 18, his father told him: "You've got to stop with this computer stuff and concentrate on school. Get your priorities straight. What do you want to do with your life?"

"I want to compete with IBM," he declared.

The following year, 1984, he dropped out of college and started up a computer company with $1,000.

Today Dell is a leading global systems and services company with revenue of $61.1 billion.

The story is recounted in Direct from Dell, written by Michael Dell and Catherine Fredman, a book full of lessons on entrepreneurship, leadership, management and the direct business model.

Dell is a customer-driven company, and its founder's definition of "best customers" is very interesting.

His best customers aren't necessarily the largest, the ones that buy the most, or the ones that require the least help or service. His best customers, he says, are those from whom the company can learn the most, who teach it ways to add value beyond its existing products or services.

At Dell, they call this adding value "beyond the box". The best customers act as leading indicators for where the market is going. They raise the bar, encouraging Dell to continually evolve from a company that sells components of a solution to a company that provides the entire solution.

Michael Dell is also keen on recruitment. No matter where you are in the lifecycle of your business, he believes, bringing in great talent should always be a top priority. It's also one of the hardest objectives to meet.

People who thrive at Dell are results-oriented, self-reliant, and driven to lead. The company gives them the authority to drive the business in a particular direction, and provides them with the tools they need to do so.

Whether you're hiring someone in an entry-level position or to run one of your largest groups, that person must be completely in sync with the company's business philosophy and objectives. If people think in a way compatible with your company's values and beliefs, they will not only work hard to fulfill immediate goals, but will also contribute to the greater goals of the organisation.

If you hire people with the potential to grow far beyond their current position, you build depth and additional capability into your organisation. That's critical when you face the next wave of growth or the next competitive challenge.

Dell recruits for succession. Everyone's job includes finding and developing their successor - not just when they are ready to move into a new role, but as an ongoing part of their performance plan.

What should you look for in today's candidates to ensure tomorrow's leadership?

At Dell they look for people who possess the questioning nature of a student and are always ready to learn something new. Because so much of what has contributed to Dell's success goes against conventional wisdom, it looks for those who have an open, questioning mind; a healthy balance of experience and intellect; people who aren't afraid to make a mistake in the process of innovation; and people who expect change to be the norm and are liberated by the idea of looking at problems from a different angle and coming up with unprecedented solutions.

Michael Dell not only interviews top-level people, but when he has time he will talk with prospective interns. He wants to learn their viewpoint and perspective. If they show a good grasp of business, they will be hired.

When he interviews people, the first thing he wants to learn is how they process information. Are they thinking in economic terms? What is their definition of success? How do they relate to people? Do they really understand the strategy of the business they're involved in today? Do they understand Dell's?

He usually asks about something the candidates did and are proud of. This gives a few insights into whether they focus on the success of the company or personal gain. Then he makes a point of actively disagreeing with them. He wants to know if they have strong opinions and are willing to defend them.

At Dell, the need is for people who are confident in their own abilities and strong in their convictions, not people who feel the need to agree in the face of conflict.

Kriengsak Niratpattanasai provides executive coaching in leadership and diversity management under the brand TheCoach. He can be reached at Copies of previous columns are available at

Tuesday, March 18, 2008

Three great resources for Linux noobs

Posted by Nanci Barthelmess on 18 March 2008 from I'm Just An Avatar - Nancy Barthelmess' Blog.

While checking out the Ubuntu Live Stats site the last few days I came across some pages with great info for people who are new to Linux as well as a great checklist of things to do when upgrading.

First off, let’s start with one for the visual learners. The Lab Rats over at Ubuntu Video News have a great video introduction to Linux and Ubuntu. This isn’t the same as the Ubuntu Screencasts, as this video actually shows the two hosts who are giving you the introduction, as well as what’s on their notebook screens. If the video isn’t available when you try to watch it, check back later. It’s worth the wait.

Mackenzie Morgan also has a great tutorial that isn’t overwhelming for people who haven’t actually tried Linux. It’s called What’s this “Linux” thing and why should I try it? and it’s something I wish I had read when Peng got me to switch to Linux from WinXP. It would have helped me switch even sooner. (Matthew Helmke posted it to his blog, which is where I saw it. Sorry about the mistake, Mackenzie.)

And with Ubuntu Hardy about a month away there’s a list of ten things to do to help you make the upgrade to Hardy easier. I wonder if Peng knew about this when he made the upgrade last week?

Monday, March 17, 2008

Open Source Issue of Computers in Libraries

Open Source Issue of Computers in Libraries (March 2008)

March 17, 2008

De Groff, A. (2008). Using Open Source to Give Patrons What They Want. Computers in Libraries 28(3): 6-10. Retrieved March 13, 2008, from Academic Search Premier database.

(2008). The Community Behind the Code. Computers in Libraries 28(3). Retrieved March 13, 2008, from Academic Search Premier database.

Balas, J. L. (2008). Open Source Becomes More Accessible. Computers in Libraries 28(3): 32. Retrieved March 13, 2008, from Academic Search Premier database.

Breeding, M. (2008). Making a Business Case for Open Source ILS. Computers in Libraries 28(3): 36-39. Retrieved March 13, 2008, from Academic Search Premier database.

Chudnov, D. (2008). What Librarians Still Don’t Know About Open Source. Computers in Libraries 28(3): 40-43. Retrieved March 13, 2008, from Academic Search Premier database.

Gordon, R. S., & West, J. (2008). What Can Open Source Do for You? Computers in Libraries 28(3): 44-45. Retrieved March 13, 2008, from Academic Search Premier database.

These recently published articles in Computers in Libraries could be useful to someone in the library community who wishes to get a quick glimpse at some of the uses of OSS in libraries, as well as some of the arguments for and against. The very fact that such a substantial part of this issue is devoted to OSS is indicative of how prevalent at least some of these solutions have become in libraries.

In his aptly titled “What Librarians Still Don’t Know About Open Source,” Chudnov graciously alerts us to the one phrase we should be sure to retain from his article: “FLOSS provides the freedom to run, study, adapt, improve, and redistribute software.” I certainly find this description of OSS (or FLOSS) freedom more useful than Gordon & West’s assertion that the “free” in free software is like the “free” in “free kittens”. They do, however, present a tidy list of OSS offerings that they believe are ripe for, or accessible to, all libraries. This list includes Firefox add-ons such as Check4Change and Accessibar, the OCLC Link Evaluator, and Greasemonkey scripts. WordPress, Drupal and Ubuntu also get a nod.

Chudnov, knowing his audience, makes a point of reminding readers that using OSS does not have to mean doing all the customization in-house. He lists examples of vendors who provide OSS along with fee-based services. He also recommends understanding the concepts around OSS in order to gain more leverage when negotiating with vendors, even for proprietary software, e.g., adding a clause that requires the vendor to hand over the code should the product be discontinued.

Friday, March 14, 2008

The Autobiography of Ben Linus - Introduction

March 14, 2008
The Autobiography of Ben Linus:
by Bus


You may not know it but I am one of the richest and most powerful men in the world. I am responsible for some well known, but also less well known technologies---some for military use. I believe I am one of the “good guys,” but I am also a liar, thief, and murderer. Most importantly, I have dedicated my life to saving the most remarkable place in the world--- MY Island. I am also saving the last of an ancient and special race of people who are now safe in their Temple. It was all worth it, and I am determined to continue to fight those who want to destroy my true family.

It started when my father got a job on a beautiful island in the Pacific. But at first it was not beautiful to me. My father taunted me--- not only blaming me for the death of my mother, who died giving birth to me, but also making plain that he hated me (and himself) his entire life. I ended that life. It never occurred to me that I could be so powerful, but on the island anything can happen. Once, at a young age, I left our protected cottages... and I actually saw my dead mother. She appeared to me and spoke to me. Was it a spirit, a ghost? Or was it an energy left by her in my body that manifested itself in front of me? Was I just crazy? These questions would soon no longer concern me, as I would have a new purpose in life. When my mother appeared, I realized the island was special, and gradually I learned that perhaps I was special too. I had found beauty.

I was soon befriended by an ancient race of miraculous people who have been on the island for thousands of years. They were quite interested that someone new to the island could feel its special effects. They adopted me and became my real family. They even accepted me with all of my faults. Believe it or not, some of those ancient people did not seem to age. But I did. The Ancients, as I call them, would never suffer from disease, and if they did get hurt their wounds would heal tremendously fast. I would later find that I was not the only newcomer to the island with special gifts, or what I would call that “special connection” to the island. One man was a paraplegic, and the instant he hit (literally hit) the island, he could walk. One could see the future. There was a boy who could be in more than one place at a time. So many fantastic stories.

I will explain in future chapters how I took part in a movement that killed hundreds of the members of the DHARMA Initiative, a subsidiary of a multi-national anti-moral economic machine that continues to be my sworn enemy---this was run by Charles Widmore. There was a reason I killed and still kill. DHARMA’s evil scientists studied these strange natives with their great powers like RATS. When there was some resistance, DHARMA conspired to murder all of them because of its own paranoid fears. I would not let that happen. I will further explain how I used what I basically stole to become the billionaire I am today. I used and continue to use the influence of wealth to protect not only what I have, but also my true family on the island. Oh, I have my problems, and all families have problems. I am selfish and really like to get my way. I’m greedy, but I don’t want others to see it. I do seek things sexual, but only on my private trips to Thailand. And I really hate it when strangers get in the way.

Is it technology governing the island, or natural forces, or the spirits--- God--- faith? Were these “effects” created by DHARMA themselves? My belief, and it’s just a belief, is that the nature of the island gave man a way to use parts of his mind he never thought possible. Shoot, maybe the island itself is alive? In any case it is quite amazing, because even the mind or spirit or energy of the dead can be alive on that island. Am I the crazy one? No. I have known many natives with amazing and intensely diverse powers. A bit of it was given to me and I am truly thankful. Whether it was really my mother or not did not matter, because she gave me something my father never could: the feeling of being loved by a new family.

DHARMA studied these phenomena, but I took it all away from them--- I took their technology and what they discovered, and thus my money stems from them. I made it so no one but myself or my friends could come and go from the island. I tried to make it so the Ancients would never be bothered again. I tried to create a utopian society working on saving mankind. I wanted to help the infertile have children--- so many things I’ve done and tried to do. But alas, there seems to be another mystery with the island. It not only can hide, but it can GRAB things. Boats, helicopters, planes--- once caught up in these strange forces, they crash here. Some live through it.

I am determined to keep the Ancients alive, to keep DHARMA away from the island. I am willing to die for it--- and kill for it. I am one of the good guys, after all.

Chapter One . . .

Green IT: Waste - How to navigate the recycling maze

There are programs out there to help businesses and consumers get rid of old equipment responsibly

Special to The Globe and Mail

As the mountains of electronic landfill pile up, businesses of all sizes need to look carefully at how they deal with e-waste.

Even what seems like a clear way to go green may have hidden loopholes. For example, if a large business simply adds a clause to its request for quote saying the company that wins the bid must recycle the computers at the end of the lease, it can then boast it is keeping its PCs out of landfill sites. But that assumes the computers will be refurbished and resold, or sent to a recycling plant.

However, if the company leasing the computers is truly dedicated to being green, it would make sure the PCs are not simply put on a slow boat to China, where computer recycling allows toxic chemicals to seep into waterbeds and puts the health and safety of workers at risk.

And it gets more complicated for smaller businesses or even individual consumers.

All major PC vendors have asset disposal programs to help businesses and consumers recycle computers. However, many companies and consumers who buy computers are not getting them to recycling depots.

Environment Canada reports that Canadians bury or incinerate 158,000 tons of obsolete computer and electronic equipment every year. That is expected to triple by 2010 when Canada's homes and businesses will collectively produce more than 400,000 tons of e-waste. E-waste, including computers, TVs and other electronics products, is the fastest growing source of waste in North America and only 11 per cent of e-waste is recycled, according to a study conducted by the University of British Columbia.

While Canadian companies "lag behind Europe and the U.S." when it comes to recycling computers, they donate more used computers to schools and charities and sell more to their employees than companies in any other country, says Marc Perrella, IDC Canada Technology Group vice-president.

However, he admits that when it comes to recycling PCs, "everything is not well organized." Small companies are far less likely to use IT disposal services than medium and large companies, an IDC survey found. An estimated 11 per cent of companies with fewer than 100 employees intend to recycle computers, compared with 65 per cent of companies with 10,000 or more employees.

A large enterprise can dedicate staff to recycling or work with a vendor to ensure there is a recycling program in place.

However, small and medium businesses and consumers are generally left to their own devices. They have to clean hard drives, pack PCs and ship them or haul them to private or municipal recycling depots. For many, it is easier and less costly to put them out with the garbage.
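Cleaning a hard drive before disposal can be more involved than simply deleting files. The sketch below illustrates the overwrite-then-delete idea on a throwaway file only; on a real PC you would wipe the whole disk with a dedicated tool (the file name here is invented for the demo):

```python
# Illustrative sketch of "cleaning" data before recycling: overwrite the
# bytes in place several times with random data, then delete. Shown on a
# throwaway file; wiping an actual disk needs a dedicated utility.
import os

def wipe_file(path, passes=3):
    """Overwrite a file's contents in place, then remove it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace every byte with noise
            f.flush()
            os.fsync(f.fileno())       # push the overwrite to disk
    os.remove(path)

# Demo: create a file with "sensitive" data, then wipe it.
with open("olddata.demo", "w") as f:
    f.write("old tax records")
wipe_file("olddata.demo")
print(os.path.exists("olddata.demo"))  # → False
```

Simply dragging files to the trash leaves the underlying bytes recoverable, which is why overwriting matters before a machine changes hands.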

Often, they cannot take used PCs back to retailers, the way beer drinkers can take empties back to beer stores. Even in Alberta, which has added a fee to the price of PCs to cover recycling costs, there is no financial incentive to recycle, such as the deposit beer drinkers get back when they return empties.

"Some retailers have recycling bins for electronics; others occasionally hold recycling days. ... But it's hit and miss.

"It costs the retailers to get rid of the equipment," says Michelle Warren, a senior analyst with Info-Tech Research Group. No laws force the manufacturers, resellers, retailers, consumers and businesses to work together, so the cycle often breaks down, she adds.

Still, many Canadian organizations are making the environment, including recycling, part of their corporate values, says Mr. Perrella. A recent IDC Canada survey of business executives revealed a significant interest "in stronger information and communications technology environmentally friendly policies and practices," he said.

If businesses want to be truly green, they have to ensure their used computers are being recycled "in an environmentally sound way," says Frances Edmonds, director of environmental programs for Hewlett-Packard (Canada) Co.

"There are economic incentives to do the wrong thing." For instance, electronics are often shipped to China because it's inexpensive to do so. Others are "cherry picked" for valuable parts, with the leftovers tossed into landfills.

To encourage businesses to recycle computers, HP will pick up any computer brand for a "cost recovery fee." Companies and consumers can arrange for pick up on HP's Planet Partners recycling website. The company has partnered with Sims Recycling Solutions, a division of Sims Group Ltd., to ensure computers are properly recycled. Hazardous material is safely removed and disposed of, metals are separated from plastics and reused and plastics are shredded and burned "in safe, controlled burns," replacing other fossil fuel and generating heat used in the recycling process, Ms. Edmonds says.

In July, 2007, HP announced that it had recycled one billion pounds of electronic products and supplies, diverting them from landfills, and it set a goal of recovering another billion pounds by the end of 2010.

"While many businesses are serious about proper IT asset disposal, there is considerable work to be done to help small companies understand the need for reliability and affordability of disposal methods," said Frank Fuser, director, services and customer experience, Dell Americas International. Dell's programs recovered 35.6 million kilograms of unwanted computer equipment for reuse or recycling from customers in 2006, a 93-per-cent increase over 2005. As a result of its global consumer and business awareness program, the company is ahead of schedule to achieve a multi-year goal of recovering 125 million kilograms of unwanted equipment by 2009.

Dell is the only computer manufacturer to offer consumers worldwide a no-charge recycling service for its own computer equipment without requiring new product purchases. The company was the first to set product recovery goals in 2004 and completed the rollout of its global recycling program in 2006.

The IT asset disposal sector is in the midst of a "major transformation," evolving from a relatively new sector to one with established processes, says Mr. Perrella.

To do your bit



Hewlett-Packard:

Takes computer hardware and inkjet or laser cartridges for recycling for a fee

Has free retail drop-off points for rechargeable batteries

Has trade-in options



Dell:

Consumers: no-charge recycling for Dell equipment without requiring new product purchases

Businesses: fee-based service for removing and recycling any used IT equipment

Donating: National Cristina Foundation helps disabled and disadvantaged children and adults



Apple:

Free computer take-back and recycling with the purchase of a new Mac

$30 (U.S.) to ship a used computer or monitor to Apple's recycling partner

reBOOT Canada


Accepts drop-off donations or arranges for pick-up for a fee of computers and some electronic equipment at several locations across the country

Non-profit provides computer hardware and training to charities, non-profits and people with limited access to technology


Electronic Recycling Association (for western Canada):

More options:

More ideas at

Eye on the future

In November, 2007, IDC Canada did a survey for Hewlett-Packard Co. of 231 information technology and other executives at large and mid-sized Canadian organizations. When asked about their companies' environmentally-friendly IT approaches, the survey indicates the proportion of businesses:

Practice                                             Doing it now   Planning in 3 years   Planning in 5 years
Recycling services for hardware                           53%              79%                  83%
Improved PC energy savings                                47%              80%                  82%
Telework/remote office (reducing carbon footprint)        32%              55%                  61%
Energy efficient means of powering equipment              28%              61%                  68%
Smart energy efficient office systems                     23%              54%                  65%

Sunday, March 09, 2008

How to: Create a Linux Box for Your Mom (50+ Resources)

By Jessica Hupp

For most computer literate children, a request from mom to get her set up on “this web thing” is met with panic and a feeling of drudgery. Are you about to expose your sweet mother to spam, phishing, viruses, or worse? Or perhaps more frightening, sign your life away as a 24/7 tech support center? Perhaps, but there’s a better way. By setting your mom up on a Linux machine, you can give her a safe, lean computing experience that will let her do all of the things she wants to do without giving you a nervous breakdown. Here, we’ve compiled over 50 of the best resources to help you get your mom on Linux without a whole lot of trouble.

Systems & Environments

With these systems and environments, you can get your mom set up with low maintenance and friendly interfaces.

  1. SimplyMEPIS: SimplyMEPIS is low-maintenance and great for Linux beginners.
  2. Linspire: Linspire is the “World’s Easiest Desktop Linux,” with a familiar look and feel for Windows users.
  3. Mandriva: Mandriva Linux was specifically designed to offer ease of use for new users.
  4. Ubuntu: One of the most popular Linux distributions, Ubuntu is stable and easy to use.
  5. KDE: The K Desktop Environment is easy to use, and offers basic desktop functions.
  6. Ximian Desktop: Ximian offers a simple layout, with large icons that are great for elderly users.
  7. Lycoris: This distribution looks a lot like Windows, and offers great ease of use.
  8. SuSE: With SuSE, you’ll get lots of popular open source software like OpenOffice, Kaffeine, and more.
  9. GNOME: In this desktop environment, you’ll find an extremely usable GUI.
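Whichever distribution you pick, it helps to be able to confirm later (say, over the phone) what is actually installed on mom's box. This sketch reads the distribution name from an /etc/os-release-style file; the demo path and contents below are invented for illustration:

```python
# Hypothetical helper: report the distribution name from an
# os-release-style file (KEY=value lines, as shipped by most distros).
def distro_name(path):
    with open(path) as f:
        for line in f:
            if line.startswith("NAME="):
                # Take everything after "NAME=" and strip quotes.
                return line.split("=", 1)[1].strip().strip('"')
    return "unknown"

# Demo against a throwaway file standing in for /etc/os-release.
with open("os-release.demo", "w") as f:
    f.write('NAME="Ubuntu"\nVERSION="8.04 LTS"\n')
print(distro_name("os-release.demo"))  # → Ubuntu
```

Knowing the exact distribution and version up front saves a lot of guesswork when you end up doing remote tech support.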

Tools & Applications

Put these tools to work to give your mom the functionality she wants while still keeping things safe and simple.

  1. Fluxbox: This X window manager makes it easy to customize the view of your mom’s machine.
  2. Rfbdrake: Set up rfbdrake to create a pathway for remote support.
  3. IEs4Linux: With this handy tool, you can make MSN groups and other Internet Explorer applications play properly for your game-addicted mom.
  4. IceWM: This window manager’s goal is to stay out of the user’s way while offering speed and simplicity.
  5. OpenAntiVirus: Although a Linux machine isn’t likely to run into virus problems, installing this antivirus program is a better-safe-than-sorry measure.
  6. Wine: Wine makes it easy to run Windows software and applications on your Linux box.
  7. Firestarter: For an easy, simple firewall, consider Firestarter.
  8. phpGACL: Keep your mom safe by implementing this access control list for applications.
  9. CrossOver Office: With CrossOver, you can run lots of Windows-based applications.
  10. CNR: This tool makes it easy for your mom to install applications, even if she’s clueless about putting things on her computer.
  11. KDE Crystal: KDE Crystal offers an icon set with recognizable images, which is great for remote support so you can tell your mom exactly what to press.
  12. Guarddog: Guarddog is an ideal firewall for novices because it offers a goal-oriented, non-technical GUI.
  13. Blackbox: Blackbox offers a clean, light environment for a Linux system.
  14. IPCop: Create a more secure home network with this simple firewall designed for novice users.
  15. vncserver: Utilize vncserver to run remote support on your mom’s Linux machine.
  16. OpenOffice: With OpenOffice, your mom will be able to do all of the word processing she wants.
  17. Evolution: This personal information manager offers email, addresses, tasks, and more in an interface much like Microsoft Outlook.
  18. KMail: Set your mom up on KMail for email with excellent spam filtering, cryptographic support, and more.
  19. Kate: With this lightweight editor, your mom can do simple word processing with automatic backup.
  20. Ekiga: With Ekiga, formerly GnomeMeeting, your mom can video chat with you.
  21. MailWasher Pro: With this program, you can make sure that spam email will never hit your mom’s inbox.
  22. Abiword: Give your mom simple word processing with AbiWord.
  23. Kopete: Use Kopete to get your mom set up on chat programs like AIM, ICQ, and IRC.
  24. Adobe Reader: Put Adobe Reader for Linux on your mom’s computer so she can enjoy PDFs.
  25. Pidgin: Pidgin, formerly known as Gaim, makes it easy for your mom to log into a number of different messaging systems at once.
  26. Konqueror: With Konqueror, your mom can browse the web safely.
  27. Thunderbird: Use Thunderbird to offer your mom a clean email interface.
  28. Firefox: Get your mom set up on the wildly popular Firefox for safe and easy web browsing.
  29. GIMP: Give your mom GIMP for Photoshop functionality.
  30. No-Script: Use No-Script to make your mom’s Firefox browsing safe from harmful Javascript and Flash.
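For the remote-support tools above (rfbdrake, vncserver), a common pattern is to tunnel the VNC session through SSH so it is encrypted on the way. The sketch below only assembles the command to run; the hostname is a placeholder, and the port numbers are the conventional VNC defaults:

```python
# Sketch: build the SSH command that forwards a local port to the VNC
# port on a remote machine. Hostname below is a placeholder.
def vnc_tunnel_cmd(host, local_port=5901, vnc_port=5900):
    """Return an ssh command forwarding local_port to the remote VNC port."""
    return f"ssh -L {local_port}:localhost:{vnc_port} {host}"

print(vnc_tunnel_cmd("moms-box.example.com"))
# → ssh -L 5901:localhost:5900 moms-box.example.com
```

After running the printed command, you would point a VNC viewer at localhost:5901 and see mom's desktop over the encrypted tunnel.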

Guides & Articles

For even more help, check out these guides and articles that will walk you through creating a Linux box for your mom.

  1. Ubuntu for your grandmother: One helpful grandchild walks his grandmother through creating a Ubuntu laptop in this article.
  2. Post Installation Configuration Basic Help: Get help with basic hardware and network configuration here.
  3. Is Linux ready for mom?: This article discusses some of the trials and advantages of Linux for novice users.
  4. Windows to Linux: A Beginner’s Guide: Let your mom check out this article to get familiarized with Linux when coming from a Windows environment.
  5. Top 10 Ways to Protect Your Linux Home System: Follow this guide to keep your mom’s computer safe.
  6. Setting up Linux for Mom and Dad: See how one person set up a parent version of Mandrake Linux in this article.
  7. Desktop Adapted for Dad (DAD): This writer gave his father a computer with carefully installed and configured software.
  8. Moving a Beginner to Linux: Learn how to make the switch with this article.
  9. The Top 50 Proprietary Programs that Drive You Crazy-and Their Open Source Alternatives: In this resource, you’re sure to find lots of programs that will help your mom convert.
  10. Linux distro for mom?: In this thread, you’ll find lots of excellent advice for creating a Linux setup for a computer illiterate mom.
  11. Beginner’s Introduction to the KDE Desktop: This guide offers a look at KDE for non-techies.
  12. A Senior Citizen’s Introduction to Linux: See how one person set up a simple Linux system for an elderly woman in this article.
  13. 7 Reasons you should switch Grandma to Linux: This article touts security, stability, and more for Linux.

Latency - Return of the "L" word

Industry Commentary By Frank Dzubeck,
Network World, 02/28/2008

What is the "L" word? It's latency.

For the first time in years, CIOs are becoming concerned that networking and inter-process latency is affecting IT performance.

The new IT initiatives of the 21st century are based on the business process transformation within a service-oriented architecture — Web services, Web 2.0, virtualization, federation and information management are creating new levels of IT performance requirements.

Add organizational and supply chain transformation through VoIP, video-based collaboration and innovative real-time, industry-specific applications and we have a major festering problem.

In the past, the solution has been the same — give IT more resources and it will solve the problem. Increase bandwidth if it is a networking issue, add memory or processors if it is a compute problem, or add faster/denser disk arrays if it is a storage problem. Expensive, but this technique has always transparently solved the problem.

From a networking perspective, the issue is becoming critical in the data center. Virtualization can create massive economic benefits, such as increased utilization of physical infrastructure, reduction in power consumption, increased productivity through ease of use and so on. What began with compute and storage virtualization has extended into applications, workflow, security and management software. Suddenly the need to monitor and manage latency is becoming evident.

The era of widely distributed data centers is over. Corporations are consolidating into fewer data centers. Within those data centers, consolidation of compute, storage and networking resources will be aggressive. The classic concepts of data center construction are being rethought and now revolve around a design philosophy similar to large Internet software services providers such as Google and Yahoo.

Simplify the infrastructure and software services around business/application-purpose components. Numerous names exist for this aggregation concept. IBM has coined the latest term for these IT entities, calling them "ensembles." Taken to the extreme, one can collapse thousands of servers within the corporate data center into a shared virtual multiprocessor server, and more than a terabyte of disk storage into collocated memory-based storage, all within a single IBM z10 super-server. Latency within this configuration is reduced to microseconds vs. milliseconds and resiliency is increased 1,000-fold.

Are we back to the era of the mainframe? No — numerous data center infrastructure design philosophies will exist, dependent upon the economics, business applications, and management and governance requirements of the corporation. Clusters of compute and storage resources will always need to be networked together as a platform for corporate applications and information. The type of networking and the amount of intelligence within the network will become evident when application, virtualization and federation latency demands and corporate governance requirements are taken into consideration.

The entire IT industry has been in denial for years over the latency issue. One only has to analyze the most recent data center announcements from Cisco, Juniper and even IBM to see the marketing obfuscation used to avoid the "L" word. This seven-letter word scares the vendor as well as the customer.

Data centers cannot exist alone. They must connect to corporate users, corporate branch offices, sales channels, suppliers and other internal or external data centers. Add to that connections to grids, clouds and the Internet. The next major networking latency issue is the WAN. Again we enter a state of denial over the "L" word when we believe that WAN acceleration is only an economic issue. The fantastic growth in the WAN acceleration industry will not cease but will escalate with the ever greater use of XML, Web Services, Web 2.0, HD Video, real-time transactions and specific application intelligence.

The service provider should not be left out of this discussion. The "L" word is so heinous that it almost never appears in a carrier’s vocabulary. All carriers primarily sell and focus their marketing on bandwidth. That bandwidth comes with varying degrees of service-level agreement (SLA) commitment. Within that SLA are usually loose guarantees of latency, measured in such a manner as to confuse the customer and to give carrier support organizations maximum leeway to absolve themselves of a specific customer problem. The carriers are learning that as they create more complex services, the "L" word is becoming an even greater issue.

Hidden in the closet for years, latency in its various forms is emerging as a subtle yet complex and critical component of business, corporate and IT success. The IT industry, from the business process concept to the business application implementation, within the data center, across the corporation and its external food chain through the WAN/Internet, must accept the negative reality of the "L" word.

IT management must begin to establish robust latency SLA policies and learn to govern, monitor and manage latency as it would any other critical technical issue. Do not avoid the issue — next-generation, intelligent latency-sensitive SLAs are needed now, rather than in the future.
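The monitoring behind such latency SLAs can start very simply: collect samples, compute a high percentile, and compare it against a budget. The sketch below is a minimal illustration; the sample values and the 50 ms budget are invented for the example.

```python
# Minimal sketch of latency SLA monitoring. Sample values and the
# p95 budget are illustrative, not real measurements.
import statistics

def sla_report(samples_ms, p95_budget_ms):
    """Summarize latency samples (ms) against a 95th-percentile budget."""
    ordered = sorted(samples_ms)
    # Nearest-rank style p95: the value 95% of the way through the list.
    p95 = ordered[int(0.95 * (len(ordered) - 1))]
    return {
        "mean_ms": statistics.mean(samples_ms),
        "p95_ms": p95,
        "within_sla": p95 <= p95_budget_ms,
    }

report = sla_report([12, 15, 11, 90, 14, 13, 16, 12, 15, 14], p95_budget_ms=50)
print(report["p95_ms"], report["within_sla"])  # → 16 True
```

Tracking a percentile rather than the mean matters here: a single 90 ms outlier barely moves the average, but a percentile budget makes the tail visible, which is exactly what an SLA needs to govern.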


Google, Lenoir, NC, try to adapt to one another

Jen Aronoff
Sunday, February 3, 2008

Internet search giant also building near Goose Creek
The first phase of Google's data center in the North Carolina foothills city of Lenoir is nearing completion. The Internet search giant is building a similar $600 million data center outside Charleston.

Jeff Willhelm/Charlotte Observer


As Google builds a $600 million data center complex in the N.C. foothills city of Lenoir, high tech is meeting small town. The globally ambitious, California-based Internet giant is working to establish itself in a close-knit world that’s decidedly un-Silicon Valley, mixing with local civic groups and donating charity Christmas trees for a public display, amid strict secrecy the company says its project requires.

At the same time, a community still reeling from an exodus of manufacturing jobs, and accustomed to homegrown furniture companies’ deep local involvement and philanthropic largesse, is greeting the tightly veiled data center with a mix of excitement and intense curiosity.

Lenoir native Stephen Clay, 57, is so pleased about Google’s arrival that he hung a “Clay Insurance Welcomes Google” banner outside his business near downtown. He also attended a Google AdWords training seminar to learn about how its advertising works.

But he and others said they still wish they knew more about the center and how it will benefit the community. Visitors aren’t allowed on the construction site, which is ringed with barbed wire.

Residents who have tried to sneak a closer peek say they’ve been run off by security guards. And employees are limited in what they can say about the project’s specifics.

“People talk about it all the time,” said Anita Watters, 40, the assistant manager of Miller Hill Grocery, just up the street from the data center. “‘Area 51.’ It’s all this secretive stuff. They’re so hush-hush about what they’re doing over there ... I hear all kinds of (speculation).”

The growing demand for Internet functions and Web-based storage and applications has tech giants such as Google and Microsoft racing to build data centers — large, climate-controlled computer warehouses — capable of processing search requests and storing vast amounts of information.

The centers, also known as server farms, require ample land, power and water, and companies closely guard information that could provide clues as to how, and how efficiently, they operate, in an effort to gain competitive advantage.

The Lenoir project sparked criticism after it was announced last year, in part because it received state and local incentives valued at up to $165 million over 30 years.

As a result, Google has worked to improve its outreach. In April, it hired consultant Matt Dunne, a former Vermont state senator and gubernatorial candidate whose career has focused on bringing together entrepreneurship, community service and politics, to listen to residents and inform them about the company.

Yet he must also manage expectations and explain the competitive reasons data centers are built and operated in secrecy.

In Lenoir, Dunne said, he’s encountered a mix of hope and concern: Hope that Google will single-handedly transform the economy and worries that the company won’t hire any local workers; excitement about a second building phase and concern that Google employees won’t live in or near Lenoir.

Google is not going to be another Broyhill or Bernhardt — furniture companies that for decades were dominant and paternalistic employers in the region — nor is it moving its headquarters to town. Though large, the data center will employ about 200 people, not 8,000, Dunne said.

“There’s not going to be a Googleplex anywhere near Lenoir,” said Tom Jacobik, Google’s Carolinas operations manager. “We’re not here to save the town. We’re here to run a business.”

At the 220-acre Google site, past, present and future coexist. More than 300 construction workers, many from local contractors, have transformed a hill into more of a mountain, moving vast quantities of earth. The surrounding neighborhood, meanwhile, looks much as it did before, with a mix of modest homes and largely shuttered furniture factories. The Blue Ridge Mountains and snowcapped top of Grandfather Mountain loom in the distance.

The first data center has risen at the base of the hill, along N.C. 18, with large cooling units at the side. It’s on track to begin limited testing this spring and should be fully up and running by the end of the year, Dunne said.

Google is also excavating the pad for the second building, which is expected to begin operating in 2009, Dunne said.

Google declined to say how large the data centers will be, but permits on file with Caldwell County call for one $15.4 million, 139,797-square-foot building and another, $24.5 million, 337,008-square-foot building.

Those permits, incidentally, are not listed under Google, but under the name Lapis LLC. Ask to make a copy, and you’ll be told it needs to be cleared by a lawyer first.

When completed, the buildings will resemble large, clean warehouses filled with cabling, cooling pipes and racks of servers.

In addition to equipment, Google has also brought new people — and its own culture — to town.

The company’s Lenoir offices will look “good and Googly,” Dunne said, with beanbag chairs, lava lamps and a massage chair. A foosball table is already in use in the temporary trailers behind the first data center building.

Employees will also receive free food, incentives to buy hybrid cars and to buy homes in Lenoir, and either an onsite gym or subsidized gym membership.

The Lenoir staff is a mix of transfers and new hires skilled at hardware maintenance, the Linux operating system, and cooling and electrical work. Though Google declined to say how many permanent employees are now on the job, it expects to eventually employ about 210 people.

The company has hired qualified locals, though it declined to say how many — and has inspired others to improve their technology training, too.

“I just wanted to be a part of (Google), a part of the culture,” said Jennifer Crump, 35, of Morganton, a former stay-at-home mom who earned an associate’s degree in information technology and was hired earlier this month as a data center technician assistant. “It’s so different from what we have around here.”

Lenoir native Walter Brameld, 30, worked in an Atlanta data center but got burned out and moved home, figuring he’d have to take “a McJob.” Then he found out about Google. Once the site location became public, he’d drive past it to reassure himself it was really coming. He was hired in October.

Working at Google is a source of great interest in the community, which can be difficult because the company is serious about confidentiality, employees said. “Normally what I tell people is that I play pingpong and eat food all day,” Brameld joked.

As a big-name newcomer, Google will encounter heightened expectations and scrutiny in Lenoir, as would other companies expanding in new regions, said Bennet Zelner, an assistant professor of strategy at Duke University’s Fuqua School of Business.

Demonstrating a commitment to being part of the community, he noted, will lower barriers and bolster the company’s reputation.

In Lenoir, Google established an informational Web site about the data center and e-mailed area natives with IT backgrounds who had moved away, to gauge their interest in returning.

It worked with the local community college to shape the curriculum of its new Information Technology Institute, which trains people for entry-level data center jobs. It staffed a booth at a local business expo last fall that turned out to be the most popular in the building, said Alan Wood of the Caldwell Economic Development Commission.

Jacobik, 42, an Air Force veteran and father of seven who previously managed a data center for Oracle in Austin, Texas, oversees Google’s Lenoir operation and another planned outside of Charleston near Goose Creek. In and around Lenoir alone, he has addressed more than a dozen civic groups, including Rotary, Kiwanis and Ruritan clubs.

“If you rate our speakers on a one to 10 as far as how excited we got about it, it was probably a 10,” said Lorene Reece, who was president of the Happy Valley Ruritan club when Jacobik visited last year. “This was something we were really curious about, and he filled that need.”

At one meeting, Jacobik said, a man approached him and asked what a “query” was. He’s met people who haven’t used computers before, which never happened in Austin.

But then, he likes how Lenoir is different. People come up to him as he’s headed to breakfast, politely asking how things are going. He takes his kids to events downtown and visits small, locally owned businesses. Even the setting reminds him of his hometown in the foothills of upstate New York.

He knows hopes are high. And even if Google isn’t in a position to save Lenoir, he’s aiming for it to provide something else worthwhile. “We’re giving people a reason to be proud in their community again,” he said, “and it’s been great to see that.”

IBM Brings 'Microsoft-free' PCs to Europe

IBM teams with Austrian and Polish system integrators to supply Linux-based systems in Europe.
Matthew Broersma,
Friday, March 07, 2008 3:00 PM PST

In a move to challenge Microsoft on the desktop, IBM has teamed up with Austrian and Polish system integrators to supply the emerging Eastern European and Russian business PC markets with "Microsoft-free" systems based on Red Hat Linux and open standards-based productivity software.

Under the deal, announced this week, IBM will work with Vienna-based VDEL and LX Polska, based in Poland, to sell systems based on what the companies call "Open Referent". The systems will be based on Red Hat Enterprise Linux Desktop, Lotus Notes, Lotus Sametime and Lotus Symphony.

IBM sold off its own PC arm to Lenovo in 2005, and the company insisted it isn't getting back into the PC business.

Nevertheless, Open Referent taps into a growing demand for low-cost desktop systems that aren't tied to Microsoft standards, and poses a direct challenge to Microsoft.

IBM said it is responding to demand from large businesses and government agencies in Eastern Europe and Russia, including Aeroflot, the Russian Ministry of Defence and the RusHotel hotel chain, and said Open Referent could cut their costs in half.

IBM emphasized that government agencies in particular are interested in open standards, with many governments beginning to require formats such as ODF or PDF for official documents.

"This is important because it's a secure and cost-effective Microsoft desktop alternative," said Kevin Cavanaugh, IBM vice president for Lotus Software, in a statement.

Lotus Notes will be provided for groupware, with Lotus Sametime for unified communications. Lotus Symphony is based on and uses the ISO-approved ODF family of open document formats.

The integrators are likely to use white-box PCs, with manufacturers varying from country to country, according to IBM.

Large organizations in the U.K. have been slow to adopt open source on the desktop, even more so than those in other Western European countries, according to open formats advocacy group OpenForum Europe. Its chief technology officer, Mike Banahan, once remarked that the U.K. is a "third world country" compared with the rest of Europe when it comes to public- and private-sector interest in open source.

OpenForum's research found that open source is blocked by many factors, including the fact that companies are often highly uncomfortable with the available technical support options.

Q Associates, a U.K. IBM reseller, told Techworld it does significant business in Linux servers but has seen minimal interest in open source desktops.

Desktop Linux integration isn't necessarily a straightforward process for a large organization, despite any cost benefits. The City of Munich, for instance, has spent several years tweaking a 14,000-desktop installation of Linux on the desktop. The effort has involved developing new software such as the GOsa network administration tool to assist in the management of the desktops.

Tuesday, March 04, 2008

Data centres - Cool it!

Mar 4th 2008

The data centres that power the internet demand a lot of power. Time, then, to make them more efficient


AS ONE industry falls, another rises. The banks of the Columbia River in Oregon used to be lined with aluminium smelters. Now they are starting to house what might, for want of a better phrase, be called data smelters. The largest has been installed by Google in a city called The Dalles. Microsoft and Yahoo! are not far behind. Google's plant consumes as much power as a town of 200,000 people. And that is why it is there in the first place. The cheap hydroelectricity provided by the Columbia River, which once split apart aluminium oxide in order to supply the world with soft-drinks cans and milk-bottle tops, is now being used to shuffle and store masses of information. Computing is an energy-intensive industry. And the world's biggest internet companies are huge energy consumers—so big that they are contemplating some serious re-engineering in order to curb their demand.

The traditional way of building data centres such as Google's is to link clusters of off-the-shelf server computers together in racks. Hundreds, even thousands, of such servers can be combined to achieve the sort of arithmetical horsepower more usually associated with a supercomputer. But the servers all require energy, of course, and so do the electronic links that enable them to work together. On top of that, once the energy has been used it emerges as heat. The advanced cooling systems required to get rid of this heat demand the consumption of more power still.

All of which is expensive. Though the price of computer hardware continues to plunge, the price of energy has been increasing. The result is that the lifetime cost of running a server now greatly outstrips the cost of buying it. A number of researchers are therefore looking for ways to operate big computer centres like the one at The Dalles more efficiently.
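
The claim that lifetime energy costs now outstrip a server's purchase price is easy to sanity-check with a back-of-the-envelope calculation. The figures below are illustrative assumptions (a commodity server, a typical cooling overhead and power price), not numbers reported for The Dalles or any other facility:

```python
# Back-of-the-envelope comparison of server purchase cost vs. lifetime energy cost.
# All figures are illustrative assumptions, not vendor or Google data.

purchase_price = 1500.0   # USD, commodity rack server
server_draw_w = 250.0     # average power draw of the server, watts
cooling_overhead = 1.0    # assume 1 extra watt of cooling/distribution per watt of IT load
price_per_kwh = 0.10      # USD per kilowatt-hour
lifetime_years = 4        # typical depreciation period

total_draw_kw = server_draw_w * (1 + cooling_overhead) / 1000.0
hours = lifetime_years * 365 * 24
energy_cost = total_draw_kw * hours * price_per_kwh

print(f"Hardware:  ${purchase_price:,.0f}")
print(f"Energy over {lifetime_years} years: ${energy_cost:,.0f}")
```

Even with these modest assumptions, the electricity bill (about $1,750) already exceeds the hardware cost, which is exactly why cheap hydroelectric power is a site-selection criterion.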

Horst Simon, a computer scientist at the Lawrence Berkeley National Laboratory in California, is working on something called the “climate computer”. The servers in data centres contain what are known as multicore processors. These combine a small number of powerful chips to do the calculations. Often these processors are more powerful than is strictly necessary for the sorts of jobs that data centres do. The climate computer would use arrays of less powerful, and hence less power-hungry, processors. The difference is staggering. Dr Simon and his colleagues think they can build a system that consumes a hundredth of the power of an existing data centre without too much loss of computational oomph.
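
The intuition behind swapping a few fast chips for many slow ones rests on how dynamic power scales in CMOS logic, roughly P ∝ V² × f: lower clock frequencies permit lower supply voltages, so power falls faster than performance. A toy comparison, with voltages and frequencies chosen purely for illustration (they are not figures from Dr Simon's project):

```python
# Toy illustration of why many slow cores can use less energy than one fast core.
# Dynamic CMOS power scales roughly as P = C * V^2 * f; all figures are assumptions.

def dynamic_power(voltage, freq_ghz, capacitance=1.0):
    """Relative dynamic power dissipation, P = C * V^2 * f."""
    return capacitance * voltage**2 * freq_ghz

# One fast server core vs. four embedded-class cores with the same aggregate clock rate.
fast = dynamic_power(voltage=1.2, freq_ghz=3.0)        # one 3 GHz core
slow = 4 * dynamic_power(voltage=0.8, freq_ghz=0.75)   # four 0.75 GHz cores

print(f"one fast core:  {fast:.2f} (relative units)")
print(f"four slow cores: {slow:.2f} (same total GHz, ~{fast/slow:.1f}x less power)")
```

Real workloads do not parallelise perfectly, so the "without too much loss of computational oomph" caveat in the article matters; but the quadratic voltage term is why the savings can be so large.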

Jonathan Appavoo, Volkmar Uhlig and Amos Waterland, three researchers at IBM, are taking a different approach. They have been tinkering with the architecture of the company's Blue Gene supercomputers. Blue Genes already use chips that consume much less power than do the microprocessors typically employed in data centres. They also use a specialised, energy-efficient communications system to handle the traffic between machines. This bespoke approach, IBM reckons, adds up to a machine that demands less energy per task, is more reliable, and occupies a lot less space than clusters bolted together from mass-produced servers.

Blue Gene computers are designed to excel at specialised, demanding tasks (they got their name because they were originally tested on some knotty problems in genomics). However, Dr Appavoo, Dr Uhlig and Dr Waterland think that, with a little tinkering, a network of Blue Genes could easily handle the software used by internet firms. Project Kittyhawk, as it is called, would link several thousand Blue Gene computers together to form a system with more than 60m processing cores and room to store 32 petabytes of data. A petabyte is a million gigabytes—and Kittyhawk, if it were built, would in theory be able to host the entire internet by itself without breaking sweat. What that would do to Oregon's economy remains to be seen.
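
The article's units can be checked with a quick conversion, taking "a million gigabytes" at its decimal (SI) value of 10^15 bytes; the per-core figure below is just that arithmetic, not a published Kittyhawk specification:

```python
# Quick unit check on the Kittyhawk figures quoted above.
# Uses decimal (SI) units, matching the article's "a petabyte is a million gigabytes".

gigabyte = 10**9          # bytes
petabyte = 10**15         # bytes
cores = 60_000_000        # "more than 60m processing cores"
storage_pb = 32           # "room to store 32 petabytes of data"

assert petabyte == 1_000_000 * gigabyte  # matches the article's definition

storage_bytes = storage_pb * petabyte
per_core = storage_bytes / cores         # average storage per core, bytes

print(f"{storage_pb} PB = {storage_bytes:,} bytes")
print(f"~{per_core / 10**6:.0f} MB of storage per core")
```

That works out to roughly half a gigabyte of storage per core, which gives a sense of how the design balances computation against data.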

Robert Plant says no to Led Zeppelin world tour and £100m


Robert Plant has turned down an extra £100million fortune to take Led Zeppelin on a world tour.

The rock legend wants to concentrate on his new partnership with US country singer Alison Krauss... spelling the likely end of the famous band.

Led Zep fans had been longing for a tour announcement since last year - when the 1970s superstars reformed for a one-off show at London's O2 Arena.

Surviving members Plant, Jimmy Page and John Paul Jones were offered a guaranteed £100m each for a tour of North America and Europe after a million fans applied for O2 tickets.

But after extensive talks Led Zep - best known for the song Stairway To Heaven - decided against the moneymaking opportunity.

Guitarist Jimmy Page, 64, was keen to do the tour this year and was backed by bass player John Paul, 62.

But singer Plant, 59 - already worth an estimated £70million - wants to concentrate on his new success with US country singer Krauss. The pair's duet album Raising Sand went to No 2 in America and Britain and they are starting their own tour in April.

A band source said: "Despite the enormous offer, the decision did not come down to money. They always said they would do the one-off show and then see how they felt.

"Jimmy had enjoyed the concert in December enough to want to tour. He argued they still had something to offer. He likes the idea of another chapter in the band - the grown-up tour.

"John sided with Jimmy. He loved making music with the others again.

"But Robert wanted to leave last year's concert as their legacy. They had proved they could still do it and that was enough.

"He has other commitments and is happier looking forward to those. Robert put the mockers on the tour."
