Data centers are becoming mass consumers of energy. Oddly enough, one of the largest sources of CO2 is badly written software.
If you're running dynamically generated pages for every web hit instead of static pages with client-side scripts, you're consuming 10x or more the CPU power of the alternative software approach, and so you'll require 10x the computing resources. But this is only half of the iceberg. For every watt consumed by the machine, roughly another watt is lost in the power supply. And for every watt spent in the computer box, an additional watt is spent cooling those systems. A concrete sketch of the static-page approach follows below.
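To make the static-versus-dynamic point concrete, here is a minimal Python sketch of the "render once, serve static" idea. The names (render_page, build_site) are illustrative, not from any particular framework, and the templating is deliberately trivial:

```python
from pathlib import Path

def render_page(title: str, body: str) -> str:
    """Do the expensive templating work once, ahead of time."""
    return f"<html><head><title>{title}</title></head><body>{body}</body></html>"

def build_site(out_dir: str = "public") -> None:
    """Render at deploy time; afterwards the web server just streams
    these files, burning almost no CPU per request."""
    Path(out_dir).mkdir(exist_ok=True)
    Path(out_dir, "index.html").write_text(
        render_page("Home", "<p>Rendered once, served a million times.</p>")
    )

if __name__ == "__main__":
    build_site()
```

The design point is simply that the rendering cost is paid once per deploy instead of once per hit, which is where the 10x comes from.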
So with a 100-watt CPU at full load (which is where you want to run it for the best power efficiency, even though consumption is also at its peak), you spend another 100 watts in the power supply, fans, etc., and another 200 watts on the air conditioner.
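Here is a minimal sketch of that arithmetic in Python, using the post's rough ratios rather than measured data: about 1 watt of power-supply and fan loss, and about 2 watts of cooling, for every watt the CPU draws:

```python
CPU_WATTS = 100.0            # CPU at full load
PSU_FAN_PER_CPU_WATT = 1.0   # loss in power supply, fans, etc. (post's estimate)
COOLING_PER_CPU_WATT = 2.0   # air conditioning for the whole box (post's estimate)

def facility_watts(cpu_watts: float) -> float:
    """Total facility draw for a given CPU load, per the ratios above."""
    return cpu_watts * (1.0 + PSU_FAN_PER_CPU_WATT + COOLING_PER_CPU_WATT)

print(f"{CPU_WATTS:.0f} W of CPU work costs {facility_watts(CPU_WATTS):.0f} W at the meter")
# -> 100 W of CPU work costs 400 W at the meter
```

Note that a 10x software inefficiency multiplies the whole 400 watts, not just the CPU's 100, which is why bad code shows up so directly on the power bill.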
Now, the argument for scripted and other higher-level languages is that they cut development time and cost. This comes at the price of higher spending on servers, application management, and power consumption. Those costs are a perpetual part of a company's burn rate and ultimately become a limiting factor for some companies.
Cooling is half the cost of running a data center; if your power source essentially gave you cooling for free (say, by driving an absorption chiller off its waste heat), you would save real money.
I am posting this link because it's insightful. It gives you some idea of why Bloom Energy is valuable, since microturbines are a direct competitor among micropower generation systems.
The University of Toledo fires up microturbines in the data center
http://www.itworld.com/data-centerservers/224865/university-toledo-fires-microturbines-data-center