"The Cloud Concierge" - The New CIO – Creating IT as a Platform
Abstract: Faced with many cloud computing options (public, private, hybrid) and the threat of “shadow IT”, tomorrow’s CIO will need to evolve corporate IT into an operational model that is less about saying “no” and more about enabling the cross-cloud capabilities required for 21st century business. This session will discuss the technology and operational transitions needed by CIOs in every industry to accommodate the trade-offs between device proliferation at the edges and operational efficiencies for their 21st century bit factories. Attendees will leave with a blueprint for evolving IT from an organization into a platform for delivering better IT.
Before I get into some of the specifics, I believe that it's valuable to identify some high-level concepts and trends that are driving technology today:
- Public Cloud Computing
- Mobile Computing - Computing “in your pocket”
- Consumerization of Devices (prices, app-stores, usability)
- Connected Applications (Web 2.0)
- Big Data Analytics and Analysis
- Move Applications to the Network (NAPs for Connectivity)
- Move Data to the Computer (Big Data)
So what does this "Cloud Concierge" really mean? Am I trying to grab headlines and predict the demise of CIOs and IT organizations? No, not really. But I do believe these trends signal the beginning of a significant shift in the way that role and those organizations provide value and impact their businesses.
Let me start with a few basic statements:
- No business today that expects to be in business in 2020 (or 2050) is going to get there without digital technology playing a central role.
- Technology is becoming more and more familiar to people of all ages and across all types of skill sets. So it's no longer unusual to have a traditionally "non-IT" person know quite a bit about a technology subject.
- Even though more people know at least something about technology, most of them don't want to be in IT.
- More external companies (or "services") are filling niches that IT may not have excelled at in the past, or are providing services at speed/pricing levels unseen in IT before.
So what's a CIO to do?
I would argue that their role needs to shift from a "controlling function" to a "connecting function". Just as a hotel concierge can provide services as simple as directions and as complex as food/music/sightseeing recommendations based on detailed preferences, the 21st century CIO needs to evolve into one who connects the business to IT functionality, delivered from anywhere.
Ultimately the new CIO needs to be thinking about creating "IT as a Platform" instead of "IT as a Cost Center" or "IT as THE ONLY Services Source". By thinking about the IT function as a platform, the CIO can begin to think seriously about a model of connecting business to services, just as open Internet platforms do through APIs today.
So what are the potential steps to start making this transition?
- Be a Market Maker - Not only does a CIO need to understand what it costs to deliver an IT service, but they need to understand how the industry leaders are able to deliver services. In many cases, there is operational knowledge to be gained from Public Cloud providers. Once this is understood, analysis can be done against internal "markets" (existing IT services) to see where changes or enhancements are possible.
- Don't Define Lines, Create New Skills - Too often, the discussion about Cloud Computing is focused only on Public vs. Private. While these lines might be helpful to categorize existing applications or security policies, they won't be as useful in the future when applications use a hybrid mix of services. CIOs need to begin building skills that address security in any domain. They need to understand how to build applications in multiple environments, and plan to have those applications potentially move to other service sources over time.
- Speed Kills, So Embrace It - Too often CIOs will get caught up in the initial costs of projects, focusing on ROI of CAPEX with a mix of OPEX improvement. But Cloud Computing's greatest attribute is the ability to deliver new services faster, while also allowing reduced "time to failure" or "time to experiment". The public IT markets already enable speed, so the new CIO needs to figure out how to enable this in any environment. Embracing this speed element will open up new business opportunities.
- Be the Single Source of Visibility - There are three ways to deal with "shadow IT", those smart users that bypass IT systems/control because the answer to their needs is too often "no": (a) ignore it, assuming that it's probably not prevalent, (b) fight to control and contain it, using "policy" or "compliance" as the barrier, (c) accept that it fills a void in existing IT services and strive to have a level of visibility into its usage. By adopting a plan that enables "c", the CIO can deliver a level of visibility that is valuable to both IT and the business, as well as create a stronger partnership between both groups. This may be as simple as starting to offer a self-service portal that includes options for various Public Cloud services (AWS, Google AppEngine, CloudFoundry, etc.) as well as internal Private Cloud services. The visibility enabled by the portal may be the perfect start to assess the need for new IT capabilities, or better business dialog with the lines of business.
- Speak the Language of Services - The Web 2.0 culture has set the expectation that there are no single-function services anymore. All services are multi-faceted and enabled through the ability to connect with other services (social networks, mobile networks, big sets of data, video, etc.). This is done through APIs - the languages that interconnect applications and share data. Bringing these skills in-house to your IT organization will provide the framework to think in terms of "services".
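To make the "services" and "visibility" ideas above concrete, here is a minimal sketch of what a self-service IT portal's catalog API might return. Everything here is hypothetical: the catalog entries, field names, and `list_services` function are illustrative, not any real portal's API.

```python
import json

# Hypothetical catalog a self-service IT portal might expose. Entries mix
# external Public Cloud options with internal Private Cloud services, so
# usage of both stays visible to IT rather than hiding in "shadow IT".
CATALOG = [
    {"name": "compute-vm",   "provider": "AWS",               "type": "public"},
    {"name": "app-platform", "provider": "Google AppEngine",  "type": "public"},
    {"name": "paas-runtime", "provider": "CloudFoundry",      "type": "public"},
    {"name": "db-cluster",   "provider": "internal",          "type": "private"},
]

def list_services(service_type=None):
    """Return catalog entries as JSON, optionally filtered by type.

    This is the kind of response a GET /services?type=private endpoint
    could serve; the shape of the data is an assumption for illustration.
    """
    entries = [s for s in CATALOG
               if service_type is None or s["type"] == service_type]
    return json.dumps(entries, indent=2)

print(list_services("private"))
```

The point is not the code itself but the posture: once every service (internal or external) is requested through one API, IT sees demand it would otherwise never know about.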
[EXPLANATIONS FROM TOPICS AT THE BEGINNING]
Public Cloud Computing - As defined in Nicholas Carr's "The Big Switch", the long-term economics and operational models that succeed in public Cloud Computing will eventually become the de-facto computing model for all of IT. Does this mean everything will be public Cloud Computing? No. But it does mean that almost every instance of private Cloud Computing will adopt the best practices that allow businesses the flexibility and speed that are available publicly today.
Mobile Computing - When VPNs and laptops emerged in the late 1990s and early 2000s, workers struggled to find the blurry line between "work" and "life". That line no longer exists as mobile computing now allows people to not only have access to work 24x7, but any information at anytime, anywhere. And in many cases, that information is interconnected to other applications (world news, Twitter, Facebook updates, sales trends, global suppliers, etc.). And if you're spending 24x7 with anything, you'd better love it. Put those two trends together and CIOs have a massive new challenge on both the DataCenter/Application and EndDevice/User sides of the equation.
Consumerization of Devices - In the past, the corporate lease on devices was 3-4 years. The first year or two, a worker was generally accepting of this device to get their job done, but the end of the lease was a painful experience. Now there is a new iDevice every 6-12 months and the jealousy level is enormous towards colleagues that get it. And those old rules of qualifying a small number of devices and saying "no" to everything else? Those days are ending very quickly as users bring more and more devices into the workplace, further blurring the lines between work and life.
Connected Applications - Regardless of the boom and bust nature of "Web 2.0 Companies" (all those ones without vowels in their names), the model they created has revolutionized how applications are built and set the tone for how users expect to experience computing. Information can come from anywhere; it can be mashed up, customized, and opened to others through APIs. Most likely it will be a mix of public and private information, so it means your data and access security models will be flipped on their heads. Throw in that some of these applications will be purchased and some will be a mashup of SaaS applications and you'd better be prepared for new costing models that mix together CAPEX and OPEX.
Big Data Analytics and Analysis - The late 90s and 2000s were all about the ability to expand your business via new networked applications. Just having the network in new places (home, remote office, coffee shop, vehicle, etc.) meant new opportunities. But that thrill is gone, with people now complaining about upload speeds at 30k feet. The new frontier is figuring out how to retain, move and analyze the massive amounts of data created between those networks. The explosion of open-source analysis frameworks (e.g., Hadoop) means that not only can your business ask new questions of itself (or the market), but the ability to get answers is less expensive than ever before. And if there is any question that "Big Data" is a real trend, just check out all the companies getting VC-funded these days - they may appear to be "social networking" or "mobile apps", but behind every one of them is a massive engine analyzing the "interconnectedness" of all that data generated by people that may be similar to your customers.
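The map/shuffle/reduce pattern that frameworks like Hadoop apply across clusters can be sketched in a few lines of in-process Python. This toy word count is only an illustration of the programming model, not how Hadoop itself is invoked; in a real cluster the map and reduce phases run in parallel on many machines close to the data.

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Emit (key, 1) pairs -- here, one pair per word in a log line.
    return [(word.lower(), 1) for word in record.split()]

def reduce_phase(pairs):
    # Group by key and sum the counts (the "shuffle" is implicit here).
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

records = ["error disk full", "error timeout", "disk replaced"]
counts = reduce_phase(chain.from_iterable(map_phase(r) for r in records))
print(counts)  # {'error': 2, 'disk': 2, 'full': 1, 'timeout': 1, 'replaced': 1}
```

The inexpensive part is exactly this: the business question ("which errors recur?") is a few lines of map and reduce logic, and the framework handles distributing it over terabytes.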
Move Applications to the Network - Your users are no longer in well-defined locations (HQ, Regional Office, Remote Office); instead they are everywhere. And the data they need to connect to could be anywhere, due to interconnected applications. And unfortunately, even with caching tricks, nobody has figured out how to make the network exceed the speed of light. So bandwidth is becoming a bottleneck, especially when trying to move data from centralized locations to distributed users, or new "follow-the-sun" models. So creative IT teams are beginning to explore models that move the applications closer to the network.
Move Data to the Computer - The last decade has seen an increase in the scope and size of Storage-Area-Networks (SAN) as IT organizations attempted to centralize the storage and management of data across multiple applications. Then virtualization technology really accelerated this over the past five years. But now "Big Data" is starting to question this approach, due to cost, complexity or performance reasons. Maybe big gobs of local data (on-board disks in servers) is a better approach for those applications? Or maybe the answer is a mix of on-board, cached data (somewhere) and SAN? These models are still being worked out, with leading practices being discovered in both Public and Private Clouds.
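The "mix of on-board, cached data and SAN" idea above can be sketched as a tiered read path. This is a deliberately simplified model: the three tiers are simulated with in-memory dicts and the names are invented for illustration, whereas real systems would sit on block devices and network storage.

```python
# Simulated storage tiers: fastest (on-board disk) to slowest (SAN).
local_disk = {"hot.dat": b"frequently used"}
cache = {"warm.dat": b"recently used"}
san = {"hot.dat": b"frequently used",
       "warm.dat": b"recently used",
       "cold.dat": b"archived"}

def read_block(name):
    """Return (data, tier) from the fastest tier holding the block,
    promoting SAN reads into the cache for the next reader."""
    if name in local_disk:
        return local_disk[name], "local"
    if name in cache:
        return cache[name], "cache"
    data = san[name]      # authoritative copy lives on the SAN
    cache[name] = data    # promote so repeat reads skip the SAN
    return data, "san"

print(read_block("cold.dat"))     # (b'archived', 'san')
print(read_block("cold.dat")[1])  # second read is served from 'cache'
```

The open question the paragraph raises is exactly where each application's working set should live; a read path like this makes the trade-off (local speed vs. centralized management) explicit.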
(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)