"Real" Grid Computing
My brother works as a consultant for improving the energy efficiency of buildings. Not long ago I had an interesting discussion with him. In IT we use the electricity grid with its big central power plants as the model for the future of utility computing. The problem: my brother told me that this is a model of the past. There is a massive trend in electricity generation away from the big central plant. The future is the small, decentralized power plant that generates electric power near the consumers, to get rid of the transport losses (those losses add up to a primary energy factor of 1:3: for each kilowatt consumed, you have to put three into the power grid). There are even thought experiments about virtual power plants that generate electrical power directly at the households. All these micro plants are operated and controlled by a central control: when a region needs more energy, the micro plants in that region start to produce more and feed it into the regional grid.

Perhaps this is the way to go in utility computing as well. The systems sit at the customers' premises, operated and managed by a central control. When another customer has a surge in its need for compute power, that power is delivered by the micro-datacenters at the other sites, for example at sites where the local customer can't fully load its own compute system (a good argument for cryptography everywhere). In this case you have fast, low-latency access to your compute power in normal operation, but also access to a vast amount of additional capacity. Smaller customers in a region could even start without machines of their own, and when their demand exceeds a certain amount of capacity, an own micro compute plant would be installed at their site.
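To make the idea a bit more concrete, here is a minimal toy sketch of what such a central control might do: cover one customer's demand surge with spare capacity from the least-loaded micro-datacenters in the same region, and flag unmet demand as a trigger to install a new micro plant at that site. All names and numbers here are made up for illustration; a real system would of course need accounting, isolation, and the cryptography mentioned above.

```python
from dataclasses import dataclass

@dataclass
class MicroDatacenter:
    site: str
    region: str
    capacity: int   # total compute units at this site
    load: int       # units currently used by the local customer

    @property
    def spare(self) -> int:
        return self.capacity - self.load

def cover_surge(region: str, demand: int,
                plants: list[MicroDatacenter]) -> dict[str, int]:
    """Greedily assign a demand surge to the sites in the region
    with the most spare capacity."""
    allocation: dict[str, int] = {}
    regional = sorted((p for p in plants if p.region == region),
                      key=lambda p: p.spare, reverse=True)
    for p in regional:
        if demand <= 0:
            break
        take = min(p.spare, demand)
        if take > 0:
            allocation[p.site] = take
            p.load += take
            demand -= take
    if demand > 0:
        # Regional capacity exhausted: the signal to install
        # a new micro compute plant in this region.
        allocation["_unmet"] = demand
    return allocation

plants = [
    MicroDatacenter("site-a", "north", capacity=100, load=90),
    MicroDatacenter("site-b", "north", capacity=100, load=40),
    MicroDatacenter("site-c", "south", capacity=100, load=10),
]
print(cover_surge("north", 50, plants))  # site-b has the most headroom
```

The greedy "most spare capacity first" policy is just one possible choice; latency to the requesting customer would be another obvious sort key.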