Wednesday, March 05, 2008

Feet in the Data Center but Head in the Clouds

Okay, you get it, I'm interested in Cloud Computing... and I have a few things to share on that in general. The last few entries have really focused on using cloud computing as a solution to what is growing into a mounting crisis, or at least a danged annoying trend: spending more time managing our computers than using them. But there's another reason that clouds are very interesting, and I think this summary on MSNBC of the work that Google and IBM are doing to start training the next generation of thinkers and problem solvers captures it well.

At its core, the article is really about how to use our compute resources to solve problems more effectively. I'm reading over and over about the time it takes for someone to start a new project inside a business. Usually the first step is to find hardware - and enough hardware - to do something interesting. Then there is the location, the configuration, the network access, the account management... oh, and what software did you need installed? By the way, is there enough power there? Is your new project mission critical? Did you think about backups? Do you need something more reliable? Soon, the mere thought of creating a new project gets plowed under by the gnarly logistics of just getting prepared for it.

The cool thinking in this Google-driven model is that it transforms compute power the same way the Internet transformed our connectedness. Keep in mind that networks are a complex mass of ethernet and switches and hubs and WANs and LANs and firewalls and wireless and cable and goodness knows what else. The complexity of configuration is high - not as high as servers, perhaps, but still not trivial. And errors like having the power cord kicked out of a switch in a lab may only take down one of my IRC servers for a while, while the rest of the Internet remains connected (yes, that happened today). But we don't yet think of compute power as a utility, nor do we think of it as a plentiful utility, which it truly is. If we added up all of the compute power currently in operation on the planet, the overall capacity would be staggering - perhaps as staggering as the sheer volume of data running around the Internet today, every day. The Internet carries email and pictures and provides services, all without us really noticing the underlying utilities. We all have access to that staggering amount of network bandwidth, but we don't generally have access to that level of compute power, even though the vast majority of compute power in the world is seriously underutilized.

Google's thinking - and IBM's, Yahoo's (er, Microsoft's), Amazon's, etc. - is that it is time to make these large data farms more accessible. Doing so requires several large shifts along the way. One shift is to provide some of the fundamentals for managing the servers in a way that reduces the impact on the environment (or the corporate pocketbook) as well as the human cost of administering these systems. Another, and the focus of the joint education program that Google and IBM are embarking on, is to start educating people on ways they can more effectively make use of that compute power - not just as a single machine, but as a utility which can be harnessed for increasingly larger challenges.
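For the curious: as I understand it, the curriculum Google and IBM are teaching centers on the MapReduce style of programming, where you write two small functions - a map and a reduce - and the framework worries about spreading the work across hundreds of machines. Here's a minimal, single-machine sketch of the idea in Python (the word-count example and all names here are my own illustration, not from the article, and a real cluster framework would run these functions in parallel):

```python
from collections import defaultdict

def map_phase(documents):
    # map: emit a (word, 1) pair for every word in every document;
    # on a real cluster, each machine would run this over its own shard
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # reduce: sum the counts for each distinct word;
    # the framework would group pairs by key before this step
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["the cloud", "the data center", "head in the clouds"]
print(reduce_phase(map_phase(docs)))
```

The point isn't the word counting, it's the division of labor: the programmer thinks about the problem, and the utility underneath handles the machines.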

My day job is focused on building up the infrastructure to simplify server environment management and ease access to that server capacity. But in the long run, the shifts in education and in programming model will hopefully arrive just as we have mastered the ability to deploy large cloud computing environments. It is definitely something to look forward to!


