Bit of a cloudy week, with a discussion of cloud risks earlier and now a related post on enterprise clouds.
Even though I'm not a fan of the term (much like Web 2.0), I suppose I'll keep using it. It occurred to me that, like a lot of consumer stuff, service-based computing should thrive inside the corporate firewall, thanks to the Slanket of trust we have with our employers.
What do I mean?
I’m talking about infrastructure. Just as cloud computing has lowered the cost of starting businesses, cloud-based infrastructure should lower procurement costs for enterprises and bolster corporate security.
Small divisions within large companies have disparate needs. Project management with MS Project, for example, may work well for teams that have the licenses and the requisite skills, whereas other teams might prefer Basecamp or an open source option. They might even have the chops to roll their own solution.
These disparate needs quickly become nightmares for procurement and corporate IT/IS. As cloud-based, web-app, and open source options continue to push out great feature sets at affordable prices or for free, centralized processes struggle to maintain the level of control they've grown accustomed to over the past decade.
In most companies, if you want to test out new software, you need hardware and physical space first. To host a web app, you'll need the networking set up: data center space, hostnames, DNS, possibly certs, and maybe even gear. All this before you even get started.
These processes don't always move fast enough for your timelines, so services on the consumer web become attractive: they're affordable and managed for you. But what if corporate security forbids you from using services outside the firewall? You're stuck with a dilemma: go through the processes to procure hardware and get it online, or change your thinking to fit what's available now.
What if your company had an EC2-like service inside its firewall that made it quick and easy to get a virtual machine up and running? You'd go straight to testing that shiny new software, and if it worked out well, you could let other teams try it and scale your infrastructure to support them.
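To make that concrete, here's a minimal sketch of what requesting a VM from an internal EC2-compatible service might look like, using boto pointed at a private endpoint (something like Eucalyptus exposes this kind of API behind the firewall). The hostname, image ID, and keypair name here are all hypothetical:

```python
from boto.ec2.connection import EC2Connection
from boto.ec2.regioninfo import RegionInfo

# Point boto at the internal cloud instead of AWS (hypothetical hostname).
region = RegionInfo(name="internal", endpoint="cloud.internal.example.com")
conn = EC2Connection(
    aws_access_key_id="YOUR_INTERNAL_KEY",
    aws_secret_access_key="YOUR_INTERNAL_SECRET",
    region=region,
)

# Launch one small VM from an internally published machine image.
reservation = conn.run_instances(
    image_id="emi-12345678",       # hypothetical internal image ID
    instance_type="m1.small",
    key_name="my-team-keypair",    # hypothetical keypair
)
print("Started instance: %s" % reservation.instances[0].id)
```

The appeal is that the API is the same one developers already know from the public cloud; only the endpoint changes.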
Add a backup service (like S3) inside the firewall and you could not only back up your application data, but you'd also have a standardized way to make it dead simple for employees to back up their personal data.
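Again, just a sketch: if the internal storage service speaks the S3 API (as Eucalyptus's Walrus does), a backup could be as simple as pointing boto at the private host. The hostname and bucket name are made up:

```python
from boto.s3.connection import S3Connection, OrdinaryCallingFormat

# Point boto at the internal S3-compatible store instead of AWS.
conn = S3Connection(
    aws_access_key_id="YOUR_INTERNAL_KEY",
    aws_secret_access_key="YOUR_INTERNAL_SECRET",
    host="storage.internal.example.com",     # hypothetical internal host
    calling_format=OrdinaryCallingFormat(),
)

# Create (or fetch) a bucket and upload a backup archive.
bucket = conn.create_bucket("team-backups")  # hypothetical bucket name
key = bucket.new_key("app-data/latest.tar.gz")
key.set_contents_from_filename("app-data.tar.gz")
```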
Central IT wins here too because they can monitor what’s being used and test out new software among their users in a centralized manner.
For example, maybe your company needs a PDF creator like JotNot because you're moving away from paper copies of everything. But security doesn't want sensitive data on outside servers, natch, and you don't want to manage the rollout of an OS-dependent installable package.

Just put any one of the many options out there up on your cloud and provide it as a web app to all your users.
Or maybe you have a platform you want people to use for application development. Put it up on your cloud with a cloud-based IDE like Bespin, and you're all set. You'll soon have a company app store, also hosted in your cloud.
A change to infrastructure via an enterprise cloud wouldn't be totally push-button. You'd need a decent data center and a good pipe, plus NOC skills from IT, and you'd still have processes.
Still, this feels like a good idea. As I think about it, I have to assume someone is doing this already. If you know of an option, let me know in the comments.
Did I miss something obvious? Don’t agree this idea would work?
Find the comments.