He quotes SmartMeters.com:
Bakersfield residents believe their new smart meters are malfunctioning because their bills are much higher than before. PG&E claims higher bills are due to rate hikes, an unusually warm summer, and customers not shifting demand to off-peak times when rates are lower.

In the same story on SmartMeters.com, State Senator Dean Florez, the Majority Leader in California, is quoted as saying: “People think these meters are fraud meters. They feel they’re being defrauded. They’re getting no benefit from these things.”
This, after $2.2b (yes, billion) was spent on the project.
Tom Raftery goes on to say:
One of the advantages of a smart grid is that the two-way flow of information will allow utilities to alert customers to real-time electricity pricing via an in-home display. PG&E have not rolled out in-home displays with their smart meters, presumably for cost reasons. If they lose the class-action lawsuit, that may turn out to have been an unwise decision.

There is a better way, however:
What PG&E should have is a system where customers can see their electricity consumption in real time (on their phone, on their computer, on their in-home display, etc.) but also, in the same way that credit card companies contact me if purchasing goes out of my normal pattern, PG&E should have a system in place to contact customers whose bills are going seriously out of kilter. Preferably a system which alerts people in real time if they are consuming too much electricity when the price is high, through their in-home display, via SMS, Twitter DM, whatever.

So what has this got to do with Cloud Computing? Quite a lot, actually. Customers of Cloud services right now depend on the "meters" being provided by the service providers themselves, just like the PG&E customers in Bakersfield. This means that they depend on the service provider itself to tell them about usage and pricing. There is no independent audit trail of usage. The meter also locks the customer into the service provider.
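The alerting idea above is simple enough to sketch. Here is a minimal illustration in Python; the thresholds, function name, and two-sigma rule are assumptions for the sketch, not anything PG&E (or any broker) actually uses:

```python
from statistics import mean, stdev

def check_usage(history_kwh, current_kwh, price_per_kwh, peak_price=0.30):
    """Flag consumption that is out of the customer's normal pattern,
    or heavy usage while the price is at a peak rate (hypothetical rules)."""
    alerts = []
    # Out-of-pattern check: current reading far above the historical mean,
    # analogous to a credit card company flagging unusual purchases.
    if len(history_kwh) >= 2:
        mu, sigma = mean(history_kwh), stdev(history_kwh)
        if current_kwh > mu + 2 * sigma:
            alerts.append(
                f"Usage {current_kwh:.1f} kWh is well above your normal {mu:.1f} kWh"
            )
    # Peak-price check: real-time nudge to shift demand off-peak.
    if price_per_kwh >= peak_price and current_kwh > mean(history_kwh):
        alerts.append(
            f"High usage while price is ${price_per_kwh:.2f}/kWh -- consider shifting load off-peak"
        )
    return alerts
```

A reading of 25 kWh against a history of 9–12 kWh at a $0.35 peak rate would trigger both alerts; a normal reading at an off-peak rate triggers none. The delivery channel (in-home display, SMS, DM) is a separate concern from the detection logic.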
A Cloud Service Broker addresses these issues. It is not a coincidence that much Cloud Service Broker terminology carries over from the world of utilities - it is solving the same problem:
Data transfer to cloud computing environments must be controlled, to avoid unwarranted usage levels and unanticipated bills from overuse of cloud services. By providing local metering of cloud services' usage, local control is applied to cloud computing by internal IT and finance teams.

The Cloud Service Broker analyzes traffic and provides reports as well as an audit trail. Reports include usage information in real time, per hour, per day, and per service. Reports are based on messages and on data. Visibility is key. This is all independent of any individual Cloud service provider. It is easy to imagine how useful this would be in conjunction with Amazon's spot pricing (see a great analysis of Amazon's spot pricing by James Urquhart here).
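The local-metering idea can be sketched as a client-side recorder that keeps its own audit trail and aggregates it into the kinds of reports described above. The class and field names here are invented for illustration; a real broker sits in the traffic path and captures far more detail:

```python
from collections import defaultdict
from datetime import datetime

class LocalMeter:
    """Client-side metering of cloud service calls: an audit trail kept
    independently of the provider's own billing meter (illustrative sketch)."""

    def __init__(self):
        self.audit_trail = []  # one record per call, in arrival order

    def record(self, service, bytes_transferred, timestamp=None):
        """Log one service call as it passes through the broker."""
        ts = timestamp or datetime.utcnow()
        self.audit_trail.append(
            {"service": service, "bytes": bytes_transferred, "ts": ts}
        )

    def report_by_service(self):
        """Messages and bytes per service -- to reconcile against the bill."""
        totals = defaultdict(lambda: {"messages": 0, "bytes": 0})
        for rec in self.audit_trail:
            totals[rec["service"]]["messages"] += 1
            totals[rec["service"]]["bytes"] += rec["bytes"]
        return dict(totals)

    def report_by_hour(self):
        """Bytes transferred per hour, for near-real-time visibility."""
        hourly = defaultdict(int)
        for rec in self.audit_trail:
            hourly[rec["ts"].strftime("%Y-%m-%d %H:00")] += rec["bytes"]
        return dict(hourly)
```

Because the records are kept on the customer's side, the per-service totals can be compared line by line against the provider's invoice, which is exactly the independent audit trail the Bakersfield customers lacked.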
The lesson from the Bakersfield debacle is that customers of services, whether utilities or Cloud services, need real-time visibility of their usage, real-time visibility of costs, as well as an independent audit trail. In the Cloud world, this is provided by a Cloud Service Broker.