Don’t Overlook Apps In Sizing Cloud Vs. On-Prem Costs


ISG reported that running the proprietary Linux distributions made available by the four cloud providers evaluated (Amazon Web Services, Microsoft Azure, Google Compute Engine and IBM's SoftLayer) will usually deliver savings.

The monthly cost of running instances of those operating systems non-stop varies from $532 to $738, while comparable on-premises environments cost about $510 per month regardless of usage.
Considering the average "public cloud Linux" price, Jones said, customers would have to run compute instances more than 78 percent of the time before cloud costs eclipsed those of private clouds in their corporate data centers.
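
A minimal sketch of that break-even arithmetic, assuming cloud charges scale linearly with how much of the month an instance runs and using an assumed $654 average for the published $532-to-$738 range:

```python
# Break-even sketch using ISG's published figures. Assumes cloud charges
# scale linearly with the fraction of hours an instance runs, while the
# on-premises cost stays fixed. The $654 average is an assumed midpoint
# for illustration; ISG publishes only the $532-$738 range.

ON_PREM_MONTHLY = 510.0        # fixed on-premises cost per month, per ISG
CLOUD_FULL_TIME_AVG = 654.0    # assumed average "public cloud Linux" full-time price

def cloud_monthly_cost(utilization: float, full_time_cost: float = CLOUD_FULL_TIME_AVG) -> float:
    """Monthly cloud cost if the instance runs `utilization` (0.0-1.0) of the time."""
    return utilization * full_time_cost

break_even = ON_PREM_MONTHLY / CLOUD_FULL_TIME_AVG
print(f"Break-even utilization: {break_even:.0%}")  # ~78%, in line with Jones' figure

for u in (0.50, 0.78, 0.90):
    print(f"{u:.0%} usage: cloud ${cloud_monthly_cost(u):.0f} vs. on-prem ${ON_PREM_MONTHLY:.0f}")
```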

Jones said the most interesting finding on proprietary Linux instances was probably the price variance among the four public clouds scrutinized: at high usage levels, the cost of standing up a virtual server differs by 40 percent between the lowest- and second-lowest-priced providers.

But ISG isn't sharing its rankings just yet; for the index, it presents only the average of the four. (A paid version of the report, due out next year, will get into those provider specifics.)

The cost of Red Hat Enterprise Linux is in line with that of Windows Server, and both are considerably more expensive than the standard distributions.

Running Windows instances round-the-clock costs users between $768 and $1,096 per month, depending on the public cloud. For Red Hat's operating system, it’s a narrower range: $796 to $910.
For both of those enterprise-grade operating systems, once usage climbs above the 55-percent threshold, the cost equation tips in favor of on-premises IT.
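
Under the same linear-scaling assumption, the 55 percent figure also implies what a comparable fixed on-premises cost would be; the values below are derived for illustration rather than reported by ISG:

```python
# Working backward from the 55 percent break-even under the same linear
# scaling assumption: the implied fixed on-premises cost is 0.55 times the
# full-time cloud price. The cloud ranges are ISG's published figures; the
# on-prem values below are derived for illustration, not reported by ISG.

BREAK_EVEN = 0.55

cloud_full_time = {
    "Windows Server": (768, 1096),           # $/month, running round-the-clock
    "Red Hat Enterprise Linux": (796, 910),
}

for os_name, (low, high) in cloud_full_time.items():
    print(f"{os_name}: implied on-prem cost ${BREAK_EVEN * low:.0f}-${BREAK_EVEN * high:.0f}/month")
```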

But even that more nuanced, application-minded cost analysis is still not an apples-to-apples comparison of cloud versus the corporate data center, according to Alex Brown, CEO of 10th Magnitude, a cloud-focused solution provider based in Chicago.

ISG is absolutely correct in emphasizing the need to focus on applications when choosing between internal or external environments, Brown told ITBestofBreed.

But Brown added that it's even more important for customers to understand that cost is not the only factor, or even the most important one, when they make those IT decisions.

"At the end of the day, (when making) the decision about how best to use cloud and when to use cloud, cost is not one of the best determinants there," Brown said. "It's about the application, and delivering it quickly and effectively and making sure it's responsive to customers, not about which cloud and OS is cheapest."

The straight-up cost comparison often leads companies to the wrong answer, he said, and he usually steers customers away from such narrow analyses.

Instead, 10th Magnitude points its customers to the platform that will be easiest for them to use and will impose as little overhead on their organizations as possible, so they can deliver services faster to their customers, Brown said.

While ISG’s Jones also said discussions about cloud adoption should never be purely financial, enterprise customers will always engage in pricing exercises before they make IT decisions, and they should do so cautiously.

That means understanding the "inherent disposition" of their applications: operating systems, usage profiles, spikes and storage structures.

One common cause of false expectations is that some cost calculators made available by public cloud providers default to displaying the price of standard Linux instances, he said.

"What we think is happening a lot is there's an assumption cloud is always cheaper because they're assuming a public cloud version of Linux, not Windows or Red Hat, which is what they will need in production," Jones said.

The biggest banks, insurance companies, manufacturers and other enterprises are typically going to use Red Hat's or Microsoft's operating systems, with bills that depart significantly from the values those cost calculators produce.

"You'll never get true cost using a calculator," Jones said.

ISG's Cloud Comparison Index should be used as a rule of thumb, Jones said, "but you will not know the true cost until you start running it in the public cloud."