A technology blog for The Economist Group IT team
Monday, June 26, 2006
Why cool is a hot topic for IT infrastructure
So, Bill Gates is "transitioning" out of Microsoft. He admitted long ago that he couldn't be as hands-on as he used to be - with products like Vista and its 50 million lines of code, that's understandable. Apparently Sergey Brin and Larry Page are trying to do things differently and keep doing what they think they do best - coming up with new ways of solving problems. One of the things they are said to be working on is how to reduce Google's biggest cost: power.
Gartner reckons that the lifetime cost of electricity for a server will approach its purchase price as hardware costs come down and electricity costs go up. A typical dual-processor server consumes around 1kW, excluding a monitor but including cooling (much of that power ends up as heat). With electricity costing around £40/MWh for large organisations, that puts the running cost at around £350 per server per year.
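The arithmetic behind that £350 figure is easy to check. Here's a back-of-the-envelope sketch using the numbers above (the variable names are just for illustration):

```python
# Rough annual electricity cost for one server, using the figures above.
POWER_KW = 1.0            # typical dual-processor server, incl. cooling
HOURS_PER_YEAR = 24 * 365  # 8,760 hours
PRICE_PER_MWH = 40.0       # GBP per MWh for a large organisation

annual_mwh = POWER_KW * HOURS_PER_YEAR / 1000  # convert kWh to MWh
annual_cost = annual_mwh * PRICE_PER_MWH

print(f"{annual_mwh:.2f} MWh/year -> GBP {annual_cost:.0f}/year")
# 8.76 MWh/year -> GBP 350/year
```

Multiply that by the tens of thousands of servers a large search operation runs and the electricity bill quickly dwarfs any single hardware purchase.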
When you consider that the likes of Google use relatively cheap hardware because they have massive redundancy, you start to see the scale of the problem. Brin and Page have looked at whether they could manufacture their own units without graphics processors, for example, since they don't need their boxes to drive displays. It also seems likely that they'd take a very close look at the cost of electricity when deciding where to build new data centres.
An article on Byte.com describes an interesting initiative at a Sun facility to test whether direct current can be supplied directly to equipment, saving the power otherwise lost in converting from AC to DC.