
Fall 1915 – Albert Einstein was miserable. A World War was underway, his wife had left him, taking his children, and he had discovered a fatal flaw in the well-received theory of gravity he had launched a few years earlier. But in one of the great feats of mental heavy lifting, by November 25th he had completed his General Theory of Relativity. The world would never be the same. Gravity, energy, matter and space were all explained and linked in a set of elegant equations. For 100 years the theory has withstood repeated tests, and it is an integral part of what has produced our modern world.

What does this have to do with you and cloud computing? A number of years ago, a fellow named Dave McCrory developed the concept of “data gravity” as an analogy to Einstein’s concept of actual gravity. Einstein had the brilliant insight to envision space-time as a sort of stretchy membrane that “sags” in the presence of mass – like a planet. Objects near the mass fall toward that sag. That interaction between space-time and mass is what we call gravity. Although we feel it as a “pull,” we are actually falling toward the center of the Earth. (For a great recounting of Einstein and his theory, see this NY Times feature.)

In a similar fashion McCrory suggested that masses of data attracted services and applications the way gravity does in the real world:

“Consider Data as if it were a Planet or other object with sufficient mass.  As Data accumulates (builds mass) there is a greater likelihood that additional Services and Applications will be attracted to this data. This is the same effect Gravity has on objects around a planet.  As the mass or density increases, so does the strength of gravitational pull.  As things get closer to the mass, they accelerate toward the mass at an increasingly faster velocity.”

Now, there is no gravity in the virtual world, so what drives Services and Applications to move toward the mass of data? Latency and Throughput. The lower the Latency (how long it takes to access and manipulate the data), the greater the Throughput; and the greater the Throughput, the more efficient, effective and reliable the application. Put another way: Latency is driven by the distance between your applications and the data they need. The closer you can physically locate an application to the databases it needs, the better the performance.
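To see why latency, not raw bandwidth, dominates for a typical application making many small sequential requests against a database, consider a simple back-of-the-envelope model. This is an illustrative sketch with assumed example numbers (a 1 Gbps link, 4 KB requests, in-datacenter vs. cross-country round-trip times), not a benchmark:

```python
def effective_throughput_mbps(request_kb, latency_ms, bandwidth_mbps):
    """Approximate effective throughput for sequential request/response
    traffic: each request pays one round trip plus its wire time."""
    request_mb = request_kb * 8 / 1000            # request size in megabits
    transfer_s = request_mb / bandwidth_mbps      # time on the wire
    total_s = latency_ms / 1000 + transfer_s      # latency dominates small requests
    return request_mb / total_s                   # effective megabits per second

# Same link speed, same request size -- only the distance to the data changes:
same_rack   = effective_throughput_mbps(4, 0.5, 1000)  # ~0.5 ms inside a datacenter
cross_coast = effective_throughput_mbps(4, 70, 1000)   # ~70 ms WAN round trip

print(f"app next to its data:   {same_rack:.1f} Mbps effective")
print(f"app far from its data:  {cross_coast:.2f} Mbps effective")
```

With these assumed numbers, the co-located application gets over a hundred times the effective throughput of the distant one on the very same link – which is exactly the “pull” that draws applications toward large masses of data.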

In the last ten years, two mutually reinforcing trends have emerged because of this “force,” and they are accelerating the movement to the cloud. First, most new applications are being built for the cloud from the outset (next year, one fourth will be in the cloud). Second, the cloud provides incredibly cost-effective and reliable storage for your data. As a result, lots of data – and we do mean a lot – is being placed in the cloud. The cloud providers assist this trend by making it easier and easier to move enormous amounts of data into their clouds.

Bottom line: your data is moving to the cloud, your applications are being written for the cloud, and the two activities reinforce each other. The inexorable force of “data gravity” is driving all of this. Some may resist the idea that most computing is moving to the cloud, but the rate of change has gone exponential – just like the way you accelerate in a real gravity field. You should be ready for the implications – your IT will look nothing like it does today – and “may the force be with you!”
