Thursday, April 14, 2011

Seeing the forest through the trees: The Cloud through The Storm

Every time I turn around now, I see so much news and info about "The Cloud" that I'm getting asked more and more frequently, "What is all this Cloud stuff about, Brandon?" So I want to break down some important elements of "The Cloud" and give everyone a little less nebulous definition of what it is.

You've got to start with a little understanding of why IT people even invented "The Cloud" before you can really understand its different major parts, and how they all interact to make this virtual storm that's flooding our living rooms with ads that make no sense to most people, and still baffle many IT people today.

Back in the 70's, and big time in the 80's, mainstream businesses started doing massive deployments of computers. They had to remain competitive, and these systems promised to increase productivity, lower costs, and pay for themselves. Back then, the PC as we know it was mostly a stand-alone computer that people rarely networked. The big players were Unix and Novell. These systems gave people the ability to work from "dumb terminals" that stored no information; they merely received letters and numbers from the server and displayed them on the screen for the user.
These kinds of "networked" systems operated with all the information stored on a set of central servers, then pushed over serial cable (very slow) to a user's terminal.

Now back then, software applications were virtually non-existent. You couldn't easily find a great "app" for what you were looking for. The solution was to hire computer programmers, who could customize the software to meet the needs of your specific business.
Inside most businesses, there weren't really any rules for how to standardize this practice or that. Most businesses of the day ran on processes either ingrained from decades past or designed around methods that had never involved computers at all.
This meant there was a giant market opportunity for programmers in nearly every business, because computers could significantly improve business process methodologies.

Over the next decade, more and more software was developed, and computer system architecture grew at lightning speed. By the 90's, major software vendors had emerged, and by cobbling together the best-of-breed business processes from thousands of clients over the previous decade, these software giants started selling fully-integrated suites of business applications, which emerged on PCs first.

The PC took off, and so did the world of computer networking. The 90's saw the rapid development of PC-based networks and the Internet.
The PC had amazing processing power and ran a powerful operating system onto which you could easily install a wide range of software. Software developers had written thousands of applications over the previous couple of decades, and they adapted the best of breed to the PC. Suddenly, powerful software that end users could actually use came to low-cost, networked PCs. Another business and consumer IT spending spree began.

What made PCs even better was that they cost significantly less than their Unix-based counterparts.
Now even smaller businesses could use computers to compete with their larger, wealthier competitors. A large sector of the market started moving toward these PC-based systems. The PC offered greater ease of use than character-based Unix, greater flexibility, the ability to install your own software from a wide range of exciting choices, and the list goes on and on. There were so many reasons to upgrade from the IT systems of the 70's and 80's that a vast majority of IT purchases during the 90's and beyond would be made on PCs.

But the PC wasn't the computing utopia some people hoped it would be. It brought all kinds of new issues. For one, data started accumulating all over the place, software was only accessible on the computers you installed it on, and flexible remote access - a norm in the 80's - became much more difficult once you introduced the graphical interfaces we use today.

The distribution of computing resources and power reached a point where its inefficiencies had to be countered.

Enter one definition of "The Cloud": the idea of taking the software you run on your computer and getting it to run in a manner that makes it distributable, accessible, and secure, without tying it to your computer.
Actually, The Cloud is a model that looks remarkably similar to the IT infrastructure models used in the 80's. Big computers serve actual software applications to end users. Groups of users collaborate in networks that share certain data between these applications.

But there's so much more to it than that.

The cloud is an idea - a methodology of computing, and it's shaping up to be quite a storm.

The computing methods of the 90's and the last decade are due for a major upgrade. When was the last time you upgraded your computers at work? It was either recently, a couple years ago, or you're really due. Computers are constantly going out of style. They have to get replaced all the time. The capital investments that companies make in their IT hardware alone account for billions of dollars spent every year, and the cycle never seems to stop.

Why do you need to upgrade your computers?
  1. My software runs slow
  2. I can't run the latest version of my software on this computer
  3. My hardware is really old and crashes a lot
Software is the #1 reason to use computers. The same software powers our productivity. So when our computers run slow, our productivity suffers.

Every year, software gets heavier and heavier, and computers get faster and faster. It's no wonder today's software slows our older computers down: the newest, fastest computers always run it great - but who can afford to upgrade all their computers every year? These days, the life of the average PC is getting longer and longer, and that's putting a big drag on user productivity.

Setting up new computers is extremely time consuming. Granted, in large companies with heavy standardization there's a process in place to help reduce migration complexity, because it's a constant task; but in mainstream USA, the average PC migration can take 4-6 hours if performed by a good technician with a good plan. I've seen complicated end user computers take double that many times. So with all these costs, what's the answer?

Confounding things further, when a user's desktop crashes it's common for some critical data to exist only there - and in a business network environment it's really hard to keep physical copies and images of all your end user desktops. We can do it, but storing all that gets expensive, and the load it puts on the network alone makes it a double-edged sword on all but the most heavily invested gigabit or fiber networks.

Back in the 80's the "Servers" delivered the applications directly to the end users. In the 90's when Microsoft came into the picture big time, that kind of stopped. Suddenly the way to go was moving "files" from the "server" to the PC where the end user's software would process the data.
Higher end applications hosted databases that many users on a network could access at once, but the standard method these software developers used was to install a physical "App" on each computer on the network.

Now, with "The Cloud" the idea is to move that processing load off the end user's desktop, and put it back on the server.
There are two major approaches being worked on right now with this path:
  • Put the applications themselves in "The Cloud" - the user gets them from a web browser
  • Put the PC Operating System (like Windows) directly in the Cloud. The user accesses their environment remotely and runs their software
An example of running your software applications in the cloud might be Salesforce.com - they have a highly integrated, widely deployed "Customer Relationship Management" application that runs entirely from your browser. The big deal is that it can be made to actually integrate with the software you're running in your business today - your mail servers like Microsoft Exchange, your accounting systems like QuickBooks, and so on and so forth. The high level of integration, coupled with it being a "web based application," made it a huge success when it debuted.

In the case of accessing your desktop environment remotely, many people are familiar with that today through tools such as Microsoft RDP, also known as Terminal Services, or Citrix, which falls into a similar category. Other solutions like LogMeIn, GoToMyPC and others have given many users untethered, unfettered access to their desktops when they're away.
Businesses have been deploying servers that let people gain remote access for years, so that's nothing new. What is new is that the Internet has become fast and stable enough to really optimize delivering virtual desktops to end users and letting them do all their computing that way.
Again, though, that's been going on for years already. What really makes this "a new way of doing things?"

The answer is a technology called virtualization, which has emerged as a major game-changing technology over the last ten years. Really the last 6-8...
So you might have heard about companies like VMWARE that sell these products which allow running "virtual" or "emulated" computing environments.
Back in about 2001, I had a copy of VMWARE installed on my XP desktop. I was able to "emulate" virtual hardware as if I had a whole different physical PC sitting there. I created several of these "virtual hardware machines" and then installed a bunch of different operating systems on them. I would start up my XP machine, then run VMWARE and boot up a Windows 2000 box, Windows 98, NT 4.0, other copies of XP - I could even boot them all up at the same time and network them all together, all from windows I could just minimize on my XP desktop while I continued using my computer normally.

This was useful for many purposes, mostly testing things. It really slowed down my computer, and it had some stability issues and other problems that made it less than useful for "mission critical server workloads" at the time.

Over the years that got fixed. Then computers got more powerful. And now, big huge datacenters with super crazy fast Internet connections can serve up virtual servers and even desktops, to end users.

So, in this usage scenario of "The Cloud," we are talking about running really big, powerful servers in a big stack, like a "computer cluster," and then chopping up their collective power into "virtual" chunks that are your servers. If a server "node" goes down, the cluster can dynamically reassemble itself with minimal or no interruption. Basically, it's self-healing. It's also incredibly redundant and very difficult to break.
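To make that a little more concrete, here's a rough sketch in Python of how an administrator might move a running virtual server from one physical node in a cluster to another, using the open-source libvirt library - one common building block for this kind of setup. The host addresses and the VM name below are made up for illustration; real cloud platforms automate this behind the scenes.

    import libvirt

    # Connect to two physical hosts ("nodes") in the cluster.
    source = libvirt.open("qemu+ssh://node1.example.com/system")
    target = libvirt.open("qemu+ssh://node2.example.com/system")

    # Find a running virtual server on the first node (hypothetical name).
    vm = source.lookupByName("accounting-server")

    # Live-migrate it to the second node while it keeps running,
    # so users barely notice the move.
    vm.migrate(target, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

    source.close()
    target.close()

The point isn't the specific tool; it's that a "server" is now just software that can be picked up and moved onto healthy hardware whenever a node misbehaves.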

But where is "The Cloud"?

It depends. Are you talking about a "Private Cloud" or a "Public Cloud"?

Here's the deal, there are two major ways the IT community builds clouds:
  • You own it - example VMWARE, Zenith Infotech SmartStyle
  • You access it - example Microsoft Azure, Amazon Web Services, Rackspace
Situations like Salesforce.com or hosted-web based solutions aren't in the list above because I'm not talking about using solutions built IN The Cloud, I'm talking about ways to actually build the cloud itself.

Owning a Private Cloud
Not comfortable with your data living on a set of servers you have no access to, or whose physical location you know little about? Worried about future news stories breaking that say "major cloud breach! data stolen!" and don't trust encryption? Maybe you have some regulatory issues to deal with and can't find a public cloud provider who can meet your compliance requirements.

The private cloud is simply a set of computer servers you buy for your business and then install, either onsite or in a hosted "data center" where you, or at least someone you know, typically has direct physical access to your hardware. Companies like VMWARE and Zenith Infotech have pre-fabricated, boxed "private cloud" packages that basically consist of a manufactured cluster of servers running a suite of virtualization software.
This hardware is a fixed capital expense, typically with ongoing maintenance and hosting costs.

The Public Cloud
So in this case, you are actually leasing or accessing a set of computing resources that run on "virtual" infrastructure provided by a large hosting provider.
Some very large companies have been building some pretty substantial public datacenters recently. This is really interesting because it gives a broad range of computer users, and businesses, access to a highly scalable, incrementally priced computing resource.

One company that came to this market a couple years ago was Amazon. They had huge datacenter facilities and the in-house expertise to deliver this kind of new computer access model. Suddenly companies could lease a server right out of the cloud and pay as little as pennies an hour for it while it was running.
Not to be outdone, Rackspace joined The Cloud revolution, launching a whole suite of similar products on a very similar platform to Amazon's.
Amazon, Google, and Rackspace, among others, have built a whole framework of cloud-based products that enable businesses to tap into everything from highly accessible and secure storage to application load balancing and running applications directly from The Cloud.
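To give a feel for how the pay-per-hour model works, here's a minimal sketch in Python using the boto library that people commonly script Amazon's cloud with. It rents one small virtual server from a machine image, then shuts it down so the meter stops. The credentials, region, and image ID are placeholders, not real values.

    import boto.ec2

    # Connect to Amazon's EC2 service (placeholder credentials).
    conn = boto.ec2.connect_to_region(
        "us-east-1",
        aws_access_key_id="YOUR_KEY",
        aws_secret_access_key="YOUR_SECRET",
    )

    # Rent a small virtual server by the hour from a machine image.
    reservation = conn.run_instances("ami-12345678", instance_type="t1.micro")
    instance = reservation.instances[0]
    print("Started %s - the hourly billing clock is running" % instance.id)

    # ... do some work with it ...

    # Shut it down so you stop paying for it.
    conn.terminate_instances(instance_ids=[instance.id])

That's the whole appeal: no hardware purchase, no datacenter visit, just a few lines of code and a bill for exactly the hours you used.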

That's another interesting usage scenario - running your software directly from the cloud without really having the operating system layer in the mix... That is about as alien an idea as you can get for most IT people, or at least it was a couple years ago...

Microsoft decided it wanted to get into this Cloud stuff a while back. They unveiled a product called "Azure," which was pretty mysterious when they first started talking about it. They are taking a little gamble here too, because this is an absolutely massive shift in the sands of IT they're going for.

It's called "Platform as a Service" and it goes a little something like this:

"What if we could get the programmers to reprogram their software, just a little bit (assuming it was built to run on Windows) and get all the software to basically run from the web browser, and all the databases just get hosted up in our servers, then all the processing just gets delivered, but the user, or user's employer actually, never has to even worry about any "server" operating system or hardware at all - because it's all just taken care of by us, Microsoft, up here in The Cloud."

So this is a huge change - instead of hosting software on a server, you're hosting software literally on the fabric of the cloud, and the computing resources are just being dynamically allocated by this infrastructure. All the while, this whole "platform" is being delivered by one huge provider, like Microsoft.

Now Azure is much more than just that. It can deploy server operating system instances now too. That's really helpful for businesses who want to start tapping into Microsoft's new Cloud platform but don't yet have this fabled "cloud software" that Microsoft is campaigning for its developers to write against its API - Application Programming Interface.

At this point, why should I even bring up APIs? It's a whole different conversation, except in this context I think it's important to mention a little something about them...

Microsoft is pushing their API for Azure.
Guess who else is pushing a Cloud based API?

Well, just about all the major cloud players out there. Amazon has an API, Rackspace spearheaded one called "OpenStack," Google's got an API for their App Engine service which competes with Amazon's Beanstalk, and both Beanstalk and App Engine are poised to be competitors to Azure itself. So as you can see, the API world for these Cloud providers is heating up.
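To show what this "cloud programming" can actually look like, here's roughly what a tiny Google App Engine application looked like around this time - a Python sketch where you only write the request handler and Google's platform worries about every server underneath it. The greeting text is just an example.

    from google.appengine.ext import webapp
    from google.appengine.ext.webapp.util import run_wsgi_app

    class MainPage(webapp.RequestHandler):
        def get(self):
            # The platform routes web requests here; no server setup involved.
            self.response.out.write("Hello from The Cloud")

    # Map the site's root URL to the handler above.
    application = webapp.WSGIApplication([("/", MainPage)], debug=True)

    def main():
        run_wsgi_app(application)

    if __name__ == "__main__":
        main()

Azure, Beanstalk, and the rest each have their own flavor of this, which is exactly why the API battle matters.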

Microsoft, knowing this, knows that a massive wave of Cloud programming is on the horizon. So naturally, it just makes sense that along the way, the .NET developers, the VB types and all the MSDN guys will inevitably start coding against the Azure API, because everyone else is fixing to start doing massive programming against Cloud APIs for all these providers anyway.

Who are these providers exactly? I've been asked that very question recently, and that's actually what inspired this post - so thanks Frank.

Here's the list of providers who are really getting into the Public Cloud Market:
  • Amazon / Beanstalk, S3, EC2, etc
  • Rackspace / Openstack
  • Google / AppEngine
  • Microsoft / Azure
Some other providers who are doing cloud servers include people like 1and1, Godaddy, GoGrid, and many others. The reason these platforms are all called "cloud" servers is that they have some kind of incremental cost and are upgradable basically at the click of a button, because they run on virtual infrastructure built on a cluster of shared resources at the provider's datacenter.

The Cloud is really a storm of methodologies that are changing the way people access and use their software.

The Cloud is nothing new. It's been around for years. It's gotten more press lately as companies like Cisco and Microsoft put ads on TV, and conversations about it happen in corporate boardrooms more and more every day.

Hopefully this article helps someone understand The Cloud a little better. It's got a wide range of elements that are still forming. Learn more about tapping into the Cloud from NODE. We can help take your business there and figure out how to navigate this huge change in IT.

