As a follow-up to my previous post on “Adequate Computing,” I’d like to share some thoughts on cloud computing. It’s quite the buzzword these days (especially with regard to data security), but I’m more interested in its application to “adequate computing.” First, though, let’s be clear about what, exactly, “cloud computing” means here. For the sake of this post, I’m referring to any technology that offloads computation, hardware requirements, and data storage from the user’s computer onto a server somewhere.
Basically, it’s any technology that transforms the end user’s computer into little more than a terminal. That might be an extreme way to put it, but in my opinion that’s the essence of the benefit cloud computing holds. Why? Because alongside the push for more and more server virtualization in data centers across the world, there’s now the potential to “virtualize” the hardware on consumer desktop and laptop computers.
First, let’s highlight some of the major drawbacks to cloud computing before we get our heads stuck in the clouds (OK, I’ll admit it: pun intended). Cloud computing means moving data across the internet. Sometimes that data is unimportant, but some data requires encryption before it can safely be passed along to a server in the cloud. While the technology for such encryption is readily available, it isn’t always implemented properly. That’s where the information security and legal issues crop up (which I will be writing a post about in one of my other blogs: Digi-Docket).
And then there’s the issue of downtime. What happens when you can’t reach the cloud? There are remedies that soften the blow of an internet outage (locally cached data, for instance), but in scenarios where the actual hardware doing the calculations lives in the cloud (for example, re-encoding a video uploaded to YouTube into a Flash version), you can’t get anything done in the meantime. Data can be cached, but hardware and computing resources can’t.
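The locally-cached-data remedy can be sketched in a few lines. This is a minimal, generic illustration (not any particular product’s implementation): `fetch_from_cloud` is a hypothetical stand-in for any cloud API call, and a `ConnectionError` stands in for an internet outage.

```python
import json
import os
import tempfile

# Path for the local cache file (using the temp dir just for this demo).
CACHE_PATH = os.path.join(tempfile.gettempdir(), "cloud_cache_demo.json")

def fetch_with_cache(fetch_from_cloud):
    """Try the cloud first; fall back to the last cached copy on failure."""
    try:
        data = fetch_from_cloud()
        with open(CACHE_PATH, "w") as f:
            json.dump(data, f)  # refresh the local cache on success
        return data
    except ConnectionError:
        # Outage: serve stale-but-available data from the local cache.
        with open(CACHE_PATH) as f:
            return json.load(f)

# Simulate one successful sync, then an outage.
def online():
    return {"inbox": ["message 1", "message 2"]}

def offline():
    raise ConnectionError("cloud unreachable")

synced = fetch_with_cache(online)   # reachable: populates the cache
cached = fetch_with_cache(offline)  # outage: falls back to the cache
```

Stale email you can still read beats no email at all, which is exactly the trade-off a POP3 client or Offline Gmail makes. Note what the sketch can’t do: if the function that raised were the computation itself, there would be nothing to fall back to.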
So, there are certainly some major issues with running everything in the cloud. For example, I work at Case Western Reserve University, which is currently in the process of switching from a localized mail server and calendaring system (iPlanet and Oracle, respectively) to a system provided by Google for free to educational institutions: Google Apps. On several occasions, the connection to Google Apps was down for several hours at a time. This wasn’t just affecting Case Western either; it was widespread among various Google Apps users.
There are ways to cache that data locally, such as storing your email in a POP3 client or using Offline Gmail. But for those using IMAP or the Gmail web interface (sans Offline Gmail), an outage can be crippling for productivity. And after several of these major outages early in the Google Apps adoption, “the cloud” has left a bad taste in some people’s mouths.
But looking beyond the technical difficulties and outage scenarios, what does cloud computing mean for the future of computers? Or, to avoid the all-too-common pitfall of trying to predict the future and failing miserably, let me rephrase the question: what technological transitions will widespread adoption of cloud computing make possible?
For starters, the shift already taking place throughout the world toward smaller and smaller electronic devices like netbooks and cell phones will eventually displace desktop and larger laptop computers. What should make that transition more feasible is the adoption of cloud computing technologies to provide the hardware capabilities that, until now, only a desktop or powerful laptop could.
Online data storage and backup is quickly replacing local storage for small to medium amounts of data. For example, I use a service called Dropbox (free up to a 2GB limit) that lets me sync data between computers (Mac, PC, Linux) and also access the files from the web. It also keeps backups of your data, in case you delete something important. It’s a very cool service that mitigates some of the problems of cloud computing by caching data locally on each computer. On the other hand, large amounts of data (digital video, system images, etc.) require too much time and bandwidth to be feasible . . . yet.
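The core trick behind a sync service is deciding what actually needs to move across the wire. Here’s a toy sketch of one common approach (a generic illustration, not Dropbox’s actual protocol): compare content hashes against what the server last saw, and upload only the files whose contents changed.

```python
import hashlib

def digest(data: bytes) -> str:
    """Fingerprint file contents with SHA-256."""
    return hashlib.sha256(data).hexdigest()

def changed_files(local_files: dict, remote_hashes: dict) -> list:
    """Return names of files whose local content differs from the server's copy."""
    return [name for name, data in local_files.items()
            if remote_hashes.get(name) != digest(data)]

# Local working copies, keyed by filename.
local_files = {"notes.txt": b"draft 2", "photo.jpg": b"cat picture"}

# Hashes the server recorded at the last sync.
remote_hashes = {"notes.txt": digest(b"draft 1"),      # stale on the server
                 "photo.jpg": digest(b"cat picture")}  # already in sync

to_upload = changed_files(local_files, remote_hashes)
```

Only `notes.txt` gets uploaded; the unchanged photo costs nothing. This is also why the approach breaks down for the “digital video, system images” case: a fingerprint tells you *that* a multi-gigabyte file changed, but you still have to push the bytes.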
The startup company OnLive is a perfect example of “virtualizing hardware”: it plans to provide gaming services that remove the need for a console or expensive gaming computer by doing all the heavy-duty rendering on its servers, where rendering a multiplayer game on one big machine is more efficient than rendering it separately on dozens of home computers. It also removes the guesswork of supporting the myriad unique combinations of hardware and software found on consumer computers. It’s not mainstream yet, but it holds the potential to change the way games and other resource-intensive computational tasks are handled in the future. Computers of the future may be measured not by their graphics cards or processors, but by the speed and latency of their Internet connections.
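To see why latency becomes the measuring stick, a back-of-envelope budget helps. The per-stage numbers below are purely illustrative assumptions (not OnLive’s published figures): at 60 frames per second, a new frame is due roughly every 16.7 ms, and every millisecond of server work and network transit adds to the lag between pressing a button and seeing the result.

```python
# Frame budget for smooth 60 fps gameplay.
FPS = 60
frame_budget_ms = 1000 / FPS  # ~16.7 ms per frame

# Hypothetical per-stage costs, in milliseconds:
render_ms  = 8    # server renders the frame
encode_ms  = 3    # server compresses it into a video stream
network_ms = 20   # round trip between the player and the data center
decode_ms  = 2    # thin client decompresses and displays the frame

total_ms = render_ms + encode_ms + network_ms + decode_ms
frames_of_lag = total_ms / frame_budget_ms  # how far the screen trails your input
```

With these assumptions the player sees the world about two frames behind their input. Notice that the network round trip dominates the budget, and no graphics card upgrade on the client can shrink it; only a faster, lower-latency connection can.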
And for all the apocalyptic speeches about how the Internet is going to die, the trend over the past decades has been a steady increase in Internet connectivity throughout the world. Given the rise of video-rich sites like YouTube, Hulu, and countless other streaming services, there is certainly a good business incentive to provide customers the bandwidth those popular services demand (see the related Net Neutrality discussion on Digi-Docket). So even if cloud computing itself doesn’t drive the increase in available Internet bandwidth, perhaps the “intertubes” will already have been opened up enough for cloud computing to take advantage of them.
So, while it remains to be seen how many everyday computing tasks will take advantage of cloud computing, one thing is clear: the cloud is always getting bigger.