Facebook Cools Off
Sun, 24 Nov 2013 17:02:31 +0000

Waste looks like the pile of plastic, Styrofoam, and cardboard that your tiny new earphones came in. It looks like plastic bottles overflowing in the waste bin beside a water fountain. It looks like a Hummer occupied by a single person. But few people think about waste when they post kitty photos on Facebook. The clusters of computers at data centers that power websites live far away from the people who use them. As a result, the energy they consume remains out of sight and out of mind.

Until recently, the executives of most companies served by data centers did not put much thought into energy either. High electricity bills were taken as a sign of thriving business. And by that measure, business is great. Facebook users upload 350 million photos, send more than 10 billion messages, and click nearly 5 billion likes every day. All that activity requires enough electricity to power a city of 155,000 people. Google could power a metropolis of 750,000, larger than Boston, Seattle, or Denver. Globally, power consumption at data centers rose by 63 percent last year, to 38 gigawatts, more than the entire electricity-generating capacity of New Zealand or Colombia.1

But the costs of processing data have now become so high that an efficiency edge can turn into a significant competitive edge, leading many high-tech firms to start relatively low-tech development programs focused on one thing: leaner data centers.

The goal is not an easy one. A New York Times investigation found that only 6 to 12 percent of the electricity used to power servers went to computations. Much of the rest went to keeping servers idling in case a rush of activity arrived, and to backup systems and air conditioning. Even momentary service interruptions can be expensive. Sears, for example, stated that it lost $1.6 million after electrical problems shut its data center for five hours during the post-holiday rush on Jan. 3, 2012. A mere five-minute outage at Google in 2013 cost the company $545,000. And an Amazon outage the same year cost $1,100 in sales per second, according to BuzzFeed.

With so much to lose, the price of excess energy seems worth it. Operators add redundant fleets of servers to their data centers, which chew up power but do nothing, waiting to take over in case a server fails. This is about as efficient as running the air conditioner full blast while on vacation. But in the data center world, duplication is not waste; it is reliability. “The risk of something going down is not worth the savings,” says Dennis Symanski, a senior project manager at the Electric Power Research Institute (EPRI), a think tank for power utilities. “They want their facilities running 99.995 percent of the time.”

There is waste, too, in cooling and maintenance services. Thousands of computers running in a single room cause temperatures to rise and computers to fail. To prevent this, giant air conditioners push cold air through floor vents and directly through the servers themselves, where it is heated and passed out as hot air.

All of this means only a fraction of energy goes to the computing services that power the websites people actually use. A commonly used measure of this fraction is power usage effectiveness, or PUE. A PUE of 1.0 means that all of a data center’s energy is going to in-use computing equipment; a 2.0 would mean that only half of its energy is. When the Uptime Institute, a New York-based data center industry group, first polled members in 2007, it found the average PUE was 2.5. In other words, members burned 1.5 kilowatt-hours of electricity for every 1 kilowatt-hour they used for computing.
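The arithmetic behind PUE is simple enough to sketch in a few lines. This is purely illustrative; the `pue` function name and the sample figures are this example's own, not an industry tool:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by
    the energy delivered to IT (computing) equipment."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# The 2007 Uptime Institute average: 2.5 kWh in total for every
# 1 kWh of computing, i.e. 1.5 kWh of overhead going to cooling,
# idling servers, and backup systems.
average_2007 = pue(total_facility_kwh=2.5, it_equipment_kwh=1.0)
print(average_2007)        # 2.5
print(average_2007 - 1.0)  # 1.5 kWh of overhead per kWh of computing
```

An ideal facility would score exactly 1.0: every kilowatt-hour drawn from the grid would reach a working server.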

Energy expenditures finally got under Walt Otis’ skin two years ago. Otis manages the data center of a professional services firm with offices around the world. Business was up, and Otis wanted to expand, but his 1,500 servers had pushed his cooling system to the max. Adding more would have overloaded it entirely.

This led Otis to look at his data center in a different way. For example, he noticed that the air conditioning vents were positioned just above the areas where computers vented out hot air, warming the chilled air before it could cool the servers. The problem was simple enough to fix: he moved the air conditioning vents away from the computer exhaust and hung plastic drapes over the servers and at the ends of aisles to keep chilled air from mixing with warm air.

With this simple change, Otis was able to dial up his thermostats by 7 degrees Fahrenheit. He did it one evening when no one was around. “The techs would have freaked out if I even talked about it because they were afraid the servers would heat up and fail,” says Otis, “but nobody even noticed because the temperature was still cooler than the heated air that was reaching the servers before.” The move cut Otis’ electric bill by nearly half, and he expanded his data center.

At Facebook, technicians also isolated server exhaust from cooler air. Then they made a more radical change: they threw away their air conditioners. The lore among technicians was that servers must operate between 64 and 70 degrees Fahrenheit to prevent overheating. It took years to realize that this was a holdover from earlier times. Newer servers can run as hot as 95 degrees Fahrenheit, says Jay Kyathsandra, Intel’s manager of data center efficiency solutions. That means that blowing in ambient outdoor air can replace air conditioning.

Lawrence Berkeley National Lab’s supercomputer center in California has made a similar change, but it relies on water instead of outside air. Warm water cascades through large towers, where some of it evaporates and lowers the temperature of the water that remains. That cooled water is then used to chill the hot air in the equipment room.

The idea can also be run in reverse: At the National Renewable Energy Laboratory’s data center in Boulder, Colo., warm air wafting off the servers is funneled into adjacent buildings to heat rooms, and also onto nearby sidewalks and parking lots to melt snow in the winter.

Modern data centers are also reducing server redundancy. Rather than have each server handle a single operation, like streaming video or email, today’s servers can act as multiple independent virtual servers, each handling a different operation. “We can use the same piece of hardware to run 5, 10, or 15 applications instead of one [application],” says John Pflueger, Dell’s representative to The Green Grid, an organization founded to improve data center efficiency.
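A back-of-the-envelope sketch shows why consolidation matters. The helper function and the counts below are illustrative assumptions, not figures from Dell or The Green Grid:

```python
import math

def hosts_needed(num_apps: int, apps_per_host: int) -> int:
    """Physical servers required at a given consolidation ratio."""
    return math.ceil(num_apps / apps_per_host)

# One application per server versus ten virtualized applications per server:
print(hosts_needed(150, 1))   # 150 machines drawing power
print(hosts_needed(150, 10))  # 15 machines for the same workload
```

Every machine removed from the floor saves not just its own power draw but the cooling overhead that would have followed it.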

These measures are having an effect. In 2012, the Uptime Institute found that average PUEs had fallen significantly from 2.5 to below 1.9. But the largest data centers have shown some of the most dramatic improvements. Facebook, for example, had an average PUE of 1.09 in 2012. Google’s most efficient server farm checked in at 1.06. In the data center world, energy savings have become as visible as a garbage can brimming with trash, and companies are cleaning up their act.
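Those PUE figures translate directly into the share of energy that actually reaches computing equipment. The rough sketch below uses the article's numbers; 1/PUE is a facility-level measure and ignores losses inside the servers themselves:

```python
def computing_fraction(pue: float) -> float:
    """Fraction of a facility's total energy delivered to IT equipment."""
    return 1.0 / pue

# PUE values cited in the article, oldest to best-in-class:
for label, value in [("Uptime 2007 average", 2.5),
                     ("Uptime 2012 average", 1.9),
                     ("Facebook 2012", 1.09),
                     ("Google's best farm", 1.06)]:
    print(f"{label}: {computing_fraction(value):.0%} of energy reaches computing")
```

The jump from 40 percent to above 90 percent is the difference the article describes between the industry's 2007 baseline and the leanest modern facilities.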

Originally posted on Nautilus.

Photo: epSos.de.

Tamagotchis Are Coming Back From the Dead
Sun, 24 Nov 2013 16:47:57 +0000

Remember Tamagotchis? Of course you do. They taught you about responsibility. They taught you about friendship. And perhaps most importantly, they taught you that friends don’t leave friends in feces-filled rooms for days at a time—because then they will die. But now, it’s time to impart that wisdom to a new generation. Rejoice, friends, for the Tamagotchi is back.

Japanese toymaker Bandai (the same company that gave the world the original) is relaunching the beloved ’90s digital pet as Tamagotchi Friends. And though it may look remarkably similar to the needy little pocket friend you once loved so dearly, the Tamagotchi of the 21st century is getting a few key updates to keep the kids these days satisfied. The most notable change is that you can actually choose one of several different “characters” to raise as your own. This is, of course, in stark contrast to the single, featureless blob we were forced to love once upon a 1996 (RIP dear blob, we hardly knew ye).

And considering it’s now common practice to exit the womb already clutching a smartphone in each hand, it should come as no surprise that the new Tamagotchi Friends comes fully loaded with “short range communication.” This lets you and your (human) friends bump your Tamagotchis together so they can interact and play—you know, like people used to do.

There’s still a while to wait, though; Tamagotchi Friends won’t be hitting US soil (at $20 a pop) until fall of next year. Of course, you can always keep your god complex satisfied with the app version until then. Because unlike your human friends, your Tamagotchi will always come back to love you again—poop death be damned.

Originally posted on Gizmodo.

Photo: Lauren Garza.
