Thursday, December 31, 2009

Who do you trust to meter the Cloud?

Tom Raftery at Greenmonk (the green shoot from Redmonk) has a great analysis of the disastrous use of smart meters by PG&E in Bakersfield, California.

He quotes SmartMeters.com:
Bakersfield residents believe their new smart meters are malfunctioning because their bills are much higher than before. PG&E claims higher bills are due to rate hikes, an unusually warm summer, and customers not shifting demand to off-peak times when rates are lower.
http://www.smartmeters.com/the-news/682-lawsuit-filed-against-pgae-for-smart-meter-overcharges.html

In the same story on smartmeters.com, State Senator Dean Florez, the Majority Leader in California, is quoted as saying “People think these meters are fraud meters. They feel they’re being defrauded. They’re getting no benefit from these things.”

This after $2.2b (yes, billion) was spent on the project.

Tom Raftery goes on to say:
One of the advantages of a smart grid is that the two way flow of information will allow utilities to alert customers to real-time electricity pricing via an in-home display. PG&E have not rolled out in-home displays with their smart meters, presumably for cost reasons. If they lose the class-action law suit, that may turn out to have been an unwise decision.
http://greenmonk.net/pge-smart-meter-communication-failure/
There is a better way, however:
What PG&E should have is a system where customers can see their electrical consumption in real-time (on their phone, on their computer, on their in-home display, etc.) but also, in the same way that credit card companies contact me if purchasing goes out of my normal pattern, PG&E should have a system in place to contact customers whose bills are going seriously out of kilter. Preferably a system which alerts people in realtime if they are consuming too much electricity when the price is high, through their in-home display, via sms, Twitter DM, whatever.
http://greenmonk.net/pge-smart-meter-communication-failure/
So what has this got to do with Cloud Computing? Quite a lot, actually. Customers of Cloud services right now depend on the "meters" provided by the service providers themselves, just like the PG&E customers in Bakersfield: they rely on the provider to tell them about their own usage and pricing. There is no independent audit trail of usage, and the meter also locks the customer into the service provider.

A Cloud Service Broker addresses these issues. It is not a coincidence that much Cloud Service Broker terminology carries over from the world of utilities - it is solving the same problem:
Data transfer to cloud computing environments must be controlled, to avoid unwarranted usage levels and unanticipated bills from over usage of cloud services. By providing local metering of cloud services' usage, local control is applied to cloud computing by internal IT and finance teams.
http://www.vordel.com/solutions/cloud.html

The Cloud Service Broker analyzes traffic and provides reports as well as an audit trail. Reports include usage information in real-time, per hour, per day, and per service, based on both message counts and data volumes. Visibility is key. This is all independent of any individual Cloud service provider. It is easy to imagine how useful this would be in conjunction with Amazon's spot pricing (see a great analysis of Amazon's spot pricing by James Urquhart here).
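As a rough sketch of local metering (an illustration of the idea, not Vordel's implementation; the service names and hourly limits below are invented), a broker sitting between internal clients and the Cloud can log every call to a local audit trail and raise an alert before the provider's bill does:

```python
import csv
import time
from collections import defaultdict

# Invented hourly byte limits per cloud service, for illustration only
HOURLY_BYTE_LIMITS = {"storage": 500_000_000, "compute": 50_000_000}

# Running totals; resetting them at the top of each hour is omitted for brevity
usage_this_hour = defaultdict(int)

def record_call(service, request_bytes, response_bytes, audit_file="cloud_audit.csv"):
    """Log one brokered cloud call to a local, provider-independent audit trail."""
    usage_this_hour[service] += request_bytes + response_bytes
    with open(audit_file, "a", newline="") as f:
        csv.writer(f).writerow([time.time(), service, request_bytes, response_bytes])
    # Local control: warn the customer in real-time, not at month's end
    limit = HOURLY_BYTE_LIMITS.get(service)
    if limit and usage_this_hour[service] > limit:
        print(f"ALERT: {service} usage {usage_this_hour[service]:,} bytes "
              f"exceeds hourly limit {limit:,}")

record_call("storage", 1_024, 300_000_000)
record_call("storage", 1_024, 300_000_000)  # second call trips the alert
```

Because the log lives on the customer's side of the connection, it can be reconciled against the provider's bill, which is exactly the reconciliation the Bakersfield customers were missing.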

The lesson from the Bakersfield debacle is that customers of services, whether utilities or Cloud services, need real-time visibility of their usage, real-time visibility of costs, as well as an independent audit trail. In the Cloud world, this is provided by a Cloud Service Broker.

Wednesday, December 30, 2009

What is a Security Token Service and what does it do?

The term Security Token Service is often bandied around, but clear examples of an STS in action tend to be lacking. Here is a video I've put together of an STS in action, including examples of the WS-Trust RequestSecurityToken / RequestSecurityTokenResponse messages.

The video shows the usage of an STS in conjunction with an XML Gateway (in fact, the Vordel XML Gateway includes a built-in STS):



It also shows how SOAPbox can be used to call an STS using the RST/RSTR messages:
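If you haven't seen these messages before, here is a rough sketch of calling an STS programmatically; the endpoint URL is a placeholder, and a real RST would normally also carry WS-Security headers authenticating the requester:

```python
import urllib.request

# Placeholder endpoint: substitute your own STS address
STS_URL = "https://sts.example.com/sts"

# A minimal WS-Trust 1.3 RequestSecurityToken (RST) asking the STS
# to issue a SAML token
RST = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <wst:RequestSecurityToken
        xmlns:wst="http://docs.oasis-open.org/ws-sx/ws-trust/200512">
      <wst:TokenType>urn:oasis:names:tc:SAML:1.0:assertion</wst:TokenType>
      <wst:RequestType>http://docs.oasis-open.org/ws-sx/ws-trust/200512/Issue</wst:RequestType>
    </wst:RequestSecurityToken>
  </soap:Body>
</soap:Envelope>"""

req = urllib.request.Request(
    STS_URL,
    data=RST.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8"},
)
# The response is a RequestSecurityTokenResponse (RSTR) carrying the issued token
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode("utf-8"))
```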



And we see the SAML assertions, returned from the STS, embedded into SOAP messages:



Check the video out for yourself at:
http://www.vordel.com/research/Security_Token_Service.html

Saturday, December 19, 2009

Information Security... what?

I've heard of information security policies, information security professionals, and information security conferences. But "Information Security Restrooms"?

I was working out of the Vordel Herndon offices for a few days this week, swinging back to the East Coast from California. I spotted this sign in Reston, Virginia, while walking in bitter cold past a skating rink to the Vordel Holiday Party. It certainly caused me to do a double-take:

Thursday, December 17, 2009

How to create a SAML Assertion

Many applications, including ESBs and Application Servers from Oracle and Sun, consume SAML assertions. Testing these applications can be a chore, since it usually requires a toolkit or API to create a SAML assertion. A good alternative is the free Vordel SOAPbox product, which includes the ability to create a SAML Assertion and place it into an XML message using just point-and-click configuration. Under the "security" menu item you can see the "Insert SAML Token" option:



You configure the SAML options graphically, no coding required:



This results in a SAML Assertion being inserted into the message, as shown below in the "Design" view of SOAPbox:



Grab your free copy of SOAPbox at: http://www.vordel.com/products/soapbox/
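For comparison, here is roughly what the toolkit route looks like: a minimal, unsigned SAML 1.1 assertion built by hand with Python's standard library (the issuer and subject names are made up, and a production assertion would be digitally signed):

```python
import uuid
from datetime import datetime, timezone
from xml.etree import ElementTree as ET

SAML_NS = "urn:oasis:names:tc:SAML:1.0:assertion"
now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# A bare SAML 1.1 assertion with a single authentication statement
assertion = ET.Element(f"{{{SAML_NS}}}Assertion", {
    "MajorVersion": "1",
    "MinorVersion": "1",
    "AssertionID": f"Id-{uuid.uuid4().hex}",
    "Issuer": "test-issuer.example.com",
    "IssueInstant": now,
})
stmt = ET.SubElement(assertion, f"{{{SAML_NS}}}AuthenticationStatement", {
    "AuthenticationMethod": "urn:oasis:names:tc:SAML:1.0:am:password",
    "AuthenticationInstant": now,
})
subject = ET.SubElement(stmt, f"{{{SAML_NS}}}Subject")
name = ET.SubElement(subject, f"{{{SAML_NS}}}NameIdentifier")
name.text = "testuser"

print(ET.tostring(assertion).decode())
```

It's easy to see why point-and-click configuration is the less laborious option for testing.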

Friday, December 11, 2009

What Google thinks SOA is

A view into the Hive Mind - what you see when you type "SOA is" into Google:

Tuesday, December 8, 2009

Randy Heffner from Forrester on Policy-based SOA

Randy Heffner from Forrester has posted on ZDNet about how "Policy-based SOA will enable increased business value and agility". He does a great job of explaining how a Policy-based SOA affects different users.

Firstly, there is the person designing the policy. As Randy says, the policy is defined "using the SOA product's administration tool" (i.e. not by writing code), and he goes on to say that "the important point here is that the policy is declared separately from the service, allowing it to change without changing the service itself". So, the policy is designed (as opposed to coded) and then applied to services. This is preferable to burying policy details in with business logic because, as Randy says, "If the policy is buried in the service implementation, the only definitive way to determine the active policy is to look at the code".
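A toy illustration of the separation Randy describes (the policy format and service below are invented, not any particular SOA product's): the policy is declared on its own, the enforcement point applies it, and changing the policy never touches the service code.

```python
# Invented, minimal policy declaration: kept outside the service code,
# so it can change without redeploying the service
POLICIES = {
    "/orders": {"require_auth": True, "max_message_bytes": 65536},
    "/status": {"require_auth": False, "max_message_bytes": 4096},
}

def enforce(path, message, authenticated):
    """Enforcement point: applies the declared policy before the service runs."""
    policy = POLICIES.get(path, {})
    if policy.get("require_auth") and not authenticated:
        raise PermissionError(f"{path} requires authentication")
    if len(message) > policy.get("max_message_bytes", float("inf")):
        raise ValueError(f"message too large for {path}")

def order_service(message):
    """Business logic only: no policy buried in here."""
    return f"processed {len(message)} bytes"

enforce("/orders", b"<order/>", authenticated=True)
print(order_service(b"<order/>"))
```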

Then we are on to the person charged with enforcing the policy. They must use a product which does not slow down service execution as a side-effect of applying policy. Given that much policy-based SOA processing boils down to XML, XML Acceleration is required here.

Then there is the person monitoring the SOA-based policies. The important point here is that the policies must map to business-level insight: if, for example, usage of a group of inventory-related services increases, that has implications for inventory itself. Randy Heffner puts it as:
Monitoring may include business-level insight. Besides technical operations data, SOA products can extract business data from service requests and responses, thereby enabling business-level monitoring.
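As a small, hedged sketch of what extracting business data from a service request might look like (the message format and element names are invented):

```python
from xml.etree import ElementTree as ET

REQUEST = """<inventoryRequest>
  <sku>ABC-123</sku>
  <quantity>40</quantity>
</inventoryRequest>"""

# Technical monitoring sees one more service call; business-level
# monitoring also sees *what* was requested
doc = ET.fromstring(REQUEST)
sku = doc.findtext("sku")
quantity = int(doc.findtext("quantity"))
print(f"business metric: {quantity} units of {sku} requested")
```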
So: We have design (policy design, independent from services), enforcement (with acceleration), and monitoring. I like to associate this with the Dilbert characters - Dilbert, Dogbert, and the Pointy-Headed Boss:


Finally, Randy recommends that policies do not become silos: centralized policy management must be used (for example, HP's GIF framework, which Vordel supports).

Maureen O'Gara in Cloud Computing Journal - Microsoft Cloud Patent Application Not What It Seems

Maureen O'Gara provides further coverage of Microsoft's Cloud Migration patent application over in Cloud Computing Journal.

Thursday, December 3, 2009

More on Microsoft's "Migrating Data To New Cloud" patent application

More commentary on Microsoft's "Migrating Data to New Cloud" patent application:

- Loraine Lawson in IT Business Edge

- Gavin Clarke in The Register

And, of course, the patent application itself:

Tuesday, December 1, 2009

Cloud Computing in Practice

James Urquhart has assembled a very impressive list of examples of Cloud Computing in practice.

Examples include:
  • Number of applications running on Force.com: 135,000
  • Number of applications hosted by Ruby on Rails platform service vendor Heroku: 40,000+
  • Objects stored in Amazon Web Services S3: 64 billion (as of August 2009)
Full details at: http://news.cnet.com/8301-19413_3-10405895-240.html

Microsoft's Cloud Migration Patent Application

Today in InformationWeek, Alexander Wolfe speculates about Microsoft's patent application for data migration between cloud services. Although on the face of it a patent for Cloud migration would appear to be aimed at removing the lock-in associated with a single vendor, the application is in fact scoped to migration within a single vendor's system. So it doesn't address the Cloud lock-in problem, which ENISA has identified as the #1 risk of cloud computing. Lock-in to a single vendor can be addressed with a Cloud Service Broker, which brokers the connection up to the Cloud service and allows a switch-over at the interface level to a back-up Cloud service in the event of service failure.
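A rough sketch of that switch-over idea (the endpoints are invented, and a real broker would also translate between the providers' interfaces and handle authentication):

```python
import urllib.request

# Invented endpoints: a primary cloud service and a back-up
ENDPOINTS = ["https://primary-cloud.example.com/api",
             "https://backup-cloud.example.com/api"]

def brokered_call(payload):
    """Try each cloud service in turn; the caller never sees which one answered."""
    last_error = None
    for url in ENDPOINTS:
        try:
            req = urllib.request.Request(url, data=payload,
                                         headers={"Content-Type": "application/xml"})
            with urllib.request.urlopen(req, timeout=5) as resp:
                return resp.read()
        except OSError as err:  # connection failure, timeout, HTTP error
            last_error = err
    raise RuntimeError(f"all cloud services failed: {last_error}")
```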