Does Data Want to be Free?
After a quick summary of the CloudU report we recently published on this topic, we launched into a free-ranging discussion of the issues surrounding open standards. We didn’t shy away from the contentious questions, but rather had a robust discussion that transcended individual interests.

It was particularly interesting to reflect on the balance between innovation and customer security. Many people contend that standards in an area tend to reduce innovation in that area. Given that both panelists are strong proponents of an open approach, I wanted to play devil’s advocate and question what impact we could expect standards to have on innovation. Both Johnstone and Sanchez pointed out that widespread cloud adoption isn’t a long-term prospect; rather, it is happening now and will accelerate over the next one to two years. As such, we are imminently facing widespread use of cloud by non-technical users, and the protection that open standards will drive is of critical importance.
Johnstone went into the drivers and focus of the Open Cloud Initiative, pointing out that it is a continuation of a push he has personally been making for a few years now. When asked why we need open standards, Johnstone was pragmatic. The bigger question he’s struggling with is a common definition of the word “Open.” The OCI has defined it as:
- Fully documented, with copyright available
- No patents requiring royalties on the specification
- Trademarks available for reuse
Johnstone believes that is an appropriate level of openness but points out that others will have differing views on where the test of openness should lie.
On the topic of lock-in, and the suggestion that companies could simply use one of the pre-existing cloud ecosystems without worrying about the openness or otherwise of the platform, Sanchez still holds concerns. He pointed out that if you build for a proprietary platform, you’re building to its tight specifications with no guarantee of ongoing data access or portability. It’s akin to buying a black-box appliance and placing it in your data center: the organization accepts the predetermined, proprietary specifications of a particular vendor. Johnstone noted that this is in fact even riskier than buying an appliance, because with outsourced technology there is no guarantee that the vendor will continue the service. He gave the example of recent price hikes on Google App Engine and the difficulties those caused for customers already committed to the platform.
Click below to tune into the full webinar. As always we’re happy to hear your feedback and thoughts.
(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)