There has been a lot of buzz around the “cloud” during the past few years, and I’ll let you in on a little secret: I’m pretty sick of it. Don’t get me wrong, I am actually a huge fan of “cloud computing” (can we stop calling it that?), and I leverage it every day in my professional and personal lives. The problem with the cloud, as I see it, is that it is largely misunderstood by consumers and over-hyped by many vendors. These two issues go hand in hand, as all too often marketers are using consumer ignorance to their advantage. The situation becomes amplified when marketing departments don’t quite understand the situation themselves, yet manage to persuade their companies to move “to the cloud,” even when it makes horribly little sense. But I digress. Let’s start with this: Just what is this cloud thing anyway?
I like to keep things simple: The cloud is remote-access computer resources. I’ll dial it back even further: The cloud is a computer somewhere other than where you are that you can store files on or even use to do some compute work remotely. Sure, it’s more complicated than that if you consider the scheduling and management software required to allow thousands of consumers to interact with thousands of computers. But, in the end, the actual benefit to the consumer is nothing more than being able to conveniently leverage computer resources that they do not own, manage, or house.
Indeed, the cloud is just someone else’s computer that you get to borrow via the Internet. That’s right, welcome back to the future, where all you need is a thin client connected to a terminal. Hello? The 1980s would like their idea back. But the cloud sounds so much cooler than thin client computing, right? Okay, maybe not.
If the cloud really is that simple, why are so many companies rushing to push their applications into it? I’d like to propose that there are three main categories of corporate cloud compulsion (sounds like a disorder because it often is). First, there are those companies that legitimately have a cool use for remotely hosted compute resources, such as Backblaze or Salesforce. Next, you have companies that see a tangible benefit in the area of licensing and maintenance. And, finally, you have companies that are so afraid of missing a trend that they blindly chase after it without understanding how or why it would be useful to their consumers.
In the wrong hands, the licensing-and-maintenance angle could be really scary, and that is the category I’d like to focus on, because I think it is what will drive a lot of improper cloud offerings in the next few years.
SaaS (software as a service) has been a hot topic among VC (venture capital) and MBA types for a few years now. By having users run software via a server that the vendor controls, that vendor can monitor customer usage directly and charge you accordingly. Presumably, the firm can prevent piracy at the same time, since you no longer install applications on a local system. Used responsibly, this can be advantageous to both the vendor and the customer, as it allows a sort of “pay as you play” way to work with an application. What could go wrong, you ask? Here are some examples, but before I begin, let me preface this by saying, “Yes, I know these things may be rectified at some time in the future.” And we’ll get to that. For now, let’s talk about the present.
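The “pay as you play” idea boils down to simple server-side metering: bill for the time actually spent in the application rather than for a perpetual license. Here is a minimal sketch of that arithmetic; the rate and rounding rule are invented for illustration, not taken from any real vendor’s pricing.

```python
# Hedged sketch of "pay as you play" SaaS billing: the vendor meters
# each session server-side and charges for actual use. The hourly rate
# below is a made-up figure for illustration only.

RATE_PER_HOUR = 4.00  # hypothetical price, in dollars

def session_charge(minutes_used, rate_per_hour=RATE_PER_HOUR):
    """Bill only for the time actually spent in the application."""
    return round(minutes_used / 60.0 * rate_per_hour, 2)

# A freelancer who uses the app for 90 minutes pays for 90 minutes,
# instead of buying the whole package up front.
print(session_charge(90))
```

The appeal to the vendor is obvious (usage visibility, piracy resistance); the appeal to the customer only holds while occasional use stays cheaper than a license.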
Running your mission-critical application on a remote server puts you on a tether. You must be connected, and have a fast connection, for a reasonable experience. At that point, you have introduced a single point of failure into your workflow, which is not super exciting for someone who spends eight hours a day in front of his or her main applications. Perhaps this is not too scary if you have a nice, fat Internet connection at home or the office, but consider traveling, or just working remotely for the day as a change of pace. I, for one, do not want to depend on my local Wi-Fi hotspot in order to work reliably in my 3D application of choice. Even with a consistent connection, latency continues to be the plague of fully interactive cloud-based tools. One could point to the fact that games and MMOGs (massively multiplayer online games) have been running on the “cloud” for years now, but generally the only thing happening on the remote server is inter-player coordination. The game itself is still running locally on your system.
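The latency complaint can be made concrete with a little arithmetic. When every mouse movement must travel to the server and the rendered result must travel back, the network round-trip time puts a hard ceiling on responsiveness. The sketch below uses illustrative numbers, not measurements:

```python
# Illustrative sketch: how network round-trip time (RTT) caps the
# responsiveness of a fully remote interactive application.
# All figures are assumptions for illustration, not benchmarks.

def max_interactive_fps(rtt_ms, server_frame_ms=5.0):
    """Best-case updates per second when every input must reach the
    server and the resulting frame must travel back."""
    total_ms = rtt_ms + server_frame_ms  # one full input-to-display cycle
    return 1000.0 / total_ms

for label, rtt in [("LAN", 2), ("good broadband", 30), ("hotel Wi-Fi", 150)]:
    print(f"{label:>14}: {max_interactive_fps(rtt):5.1f} updates/sec ceiling")
```

Even before bandwidth enters the picture, a 150 ms round trip limits a fully remote tool to a handful of updates per second, which is why a locally running game with server-side coordination feels fine while a fully remote 3D viewport does not.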
Another important factor to consider is intellectual property. You might be the sort of person who is comfortable hosting your data remotely. But consider this: Not everyone is working with data that they own. If I am a freelance artist working on an illustration for Disney, how will the company feel if I am creating the artwork leveraging a server hosted by a third party? This is not an intractable problem, of course, but it is one of many niggling issues that have yet to be worked out. In fact, I recently watched the reaction of many professional 3D users when a software vendor announced that the Amazon EC2 network was to be used as a renderfarm. The idea seems perfect. The rates are very inexpensive, and you have thousands of cores at your disposal. But the honest reaction from the users was, “This is really cool. Too bad I can’t use it.” The reason was IP safety and contractual obligations with their clients.
At the same time, an even larger company announced that it would offer a trial version of its main application “via the cloud” (that is, running on a remote server). This, the firm claimed, would be a perfect way to evaluate its application. Mind you, this is a many-thousand-dollar application that requires significant hardware. I can understand how the company came to this idea, and I can almost see the panicked mid-level product marketing team struggling with ways to “leverage the cloud” to extend business.
After all, this is a hot topic. We simply must act! Right? No. This is perhaps the most egregious example of corporate cloud compulsion. Hosting a monolithic 3D application on a remote server is not a workable long-term solution for any professional 3D artist, so why on earth would it be a reasonable way to evaluate the application? Here at Luxology, when running trial software, we need to run it locally on our own hardware so we can assess what sort of real-world experience we will have. Suffice it to say, the idea fell flat.
Fear not, it isn’t all doom and gloom. As I indicated earlier, I am a big fan of remotely hosted computer resources (Say it with me!), or we can call it RHCR for short. The key is to use this newfound power responsibly. We all know that with great power comes great responsibility. If we consider that for the foreseeable future our Internet connections will not provide 100 percent uptime, will not be available all the time when we are away from the home or office, and will not provide content-creator-style bandwidth anytime soon, then we can begin to understand how we and our software vendors should leverage this RHCR.
One successful RHCR strategy is to provide a service that can work without intervention and can sustain breaks in connectivity. Data backup is perhaps the best example of this. The second reasonable use for RHCR is remote compute power. I know I said many people rejected the idea of rendering via EC2, but there are also many people who work with smaller clients and will be able to leverage this sort of on-demand power. The advantage here is that the remote system goes off and performs its duty without requiring a constant connection; when the compute is done, the data is piped back to you and everyone is happy.
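That “submit, disconnect, collect later” pattern can be sketched in a few lines. The job and polling functions below are hypothetical stand-ins, not any real render-farm API; the point is simply that a batch workflow tolerates the dropped connections that cripple an interactive one:

```python
# Sketch of the "submit, disconnect, collect later" pattern that makes
# remote compute a good fit for RHCR. The poll function is a
# hypothetical stand-in, not a real render-farm API.
import time

def wait_for_job(poll, retries=5, delay=0.0):
    """Poll a remote job, tolerating transient connection failures.

    `poll` returns the finished result, returns None while the job is
    still running, or raises ConnectionError when the link drops.
    """
    failures = 0
    while True:
        try:
            result = poll()
        except ConnectionError:
            failures += 1
            if failures > retries:
                raise  # link is genuinely down; give up
            time.sleep(delay)  # back off, then retry
            continue
        failures = 0  # link is healthy again
        if result is not None:
            return result  # output is ready; pipe it back home
        time.sleep(delay)

# Simulated remote render job: the link drops once, then the job finishes.
states = iter([ConnectionError(), None, "frames_0001-0240.exr"])
def fake_poll():
    s = next(states)
    if isinstance(s, Exception):
        raise s
    return s

print(wait_for_job(fake_poll))  # the job survives the dropped connection
```

Because the remote machines keep working whether or not you are watching, a flaky hotel Wi-Fi link only delays collecting the result, it does not corrupt the work.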
As you may have determined, the most challenging aspect of leveraging the cloud is interacting with a remotely hosted application. The more complex the application, the more complex the problem, mainly because people who use complex applications tend to sit in front of them for hours and hours at a time. I believe there is a place for cloud computing and interactive applications, but for them to be most useful to the consumer, they should be highly focused applications that the user would generally interact with for less than an hour at a time and probably not every day. Once you get to everyday usage, I firmly believe that running the application locally is a couple of orders of magnitude more reasonable.
I would like to see more companies being creative with their cloud offerings. Rather than just cramming their existing monolithic applications onto a remote server, why not find new ways to provide focused technologies to a larger base in a way that actually makes sense for the cloud today? But that’s just me.
So where is all of this heading? What is the future of the cloud and 3D? Bandwidth will undoubtedly become broader and our connections more stable. Wi-Fi will continue to push further into the wide-area domain. In my utopian view of the world, Internet access will become a basic utility, like power or water, and will be just as available. When it becomes fast, constant, and ubiquitous, we can relax somewhat about that single point of failure and move ever more toward ultra-thin client network devices (the iPad is a forerunner in this area). When I have the confidence that broadband connections will be as readily available as power when I am out and on the go, I’ll be happy to give up lugging a multicore laptop in favor of a paper-thin OLED device with multitouch to run my 3D application from the cloud. Until then, it’s my firm opinion that we all need to get back to reality.
Brad Peebler is president and co-founder of Luxology, with 20-plus years of extensive industry experience.