In a recent article, I discussed the cost trade-offs of using AWS Reserved Instances compared to On-Demand Instances, especially in light of AWS’ habit of reducing prices of the latter. We saw that the savings promised by Reserved Instances dissolve if On-Demand pricing is reduced.
I also lamented the fact that AWS’ approach of asking for money upfront for Reserved Instances violates the spirit of cloud computing, which is supposed to have a “pay as you go” approach.
Keeping Costs Low
So, how do you keep your costs low without giving up “pay as you go”? There are a few options:
- You can use utilization metrics and advanced analytics, such as those found in third-party products, to suggest more appropriate instance sizes based on your actual usage. A 2013 study by the Big Data Group showed that average AWS instance utilization rates are only 8-9%. So, if you’re running an m3.2xlarge at $0.56/hour, you might be able to get away with a t2.medium or an m3.medium at one-eighth to one-tenth of the cost.
- Another approach is to use a cloud management platform, such as Ostrato’s cloudSM™, which lets you place parking calendars on instances, shutting them down during off-hours. For example, in a development environment that is active from 7:00 a.m. to 7:00 p.m. on weekdays but shut down evenings and weekends, a parking calendar could reduce your spend on every instance by roughly 64% per month.
- Lastly, you could just use a different cloud provider with lower On-Demand pricing, such as Google Compute Engine.
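The arithmetic behind the first two options is easy to verify. Here is a quick sanity check in Python, using the hourly rates cited above (2014-era figures, for illustration only):

```python
# Verify the right-sizing and parking-calendar savings claimed above.
# Hourly rates are the era's AWS On-Demand figures cited in the text.

M3_2XLARGE = 0.560   # $/hour, as cited above
M3_MEDIUM = 0.070    # $/hour, roughly one-eighth of the m3.2xlarge rate

# Right-sizing: moving a lightly used m3.2xlarge down to an m3.medium
right_sizing_savings = 1 - M3_MEDIUM / M3_2XLARGE
print(f"Right-sizing savings: {right_sizing_savings:.0%}")

# Parking calendar: instance runs 7 a.m.-7 p.m. on weekdays only,
# i.e. 60 of the 168 hours in a week
active_hours_per_week = 12 * 5
parking_savings = 1 - active_hours_per_week / (24 * 7)
print(f"Parking-calendar savings: {parking_savings:.0%}")   # ~64%
```

The 64% figure in the parking-calendar example falls straight out of the schedule: the instance is active only 60 of 168 weekly hours.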
Since the first two alternatives will work in any cloud provider and with any instance size, let’s look at the last alternative.
Taking Advantage of On-Demand Pricing through Google Compute Engine
In my last article, I used a c3.large AWS instance to compare Reserved Instance and On-Demand pricing. For continuity, let’s continue to use that instance size and compare costs with a similar Google Compute Engine instance.
An AWS c3.large is a two-vCPU, Intel Ivy Bridge instance with 4 GB (3.75 GiB) of RAM and two 16 GB SSDs. On-Demand pricing for a c3.large is $76.86 per month, assuming 7×24 use in U.S. East. Finding a comparable Google Compute Engine instance is a bit tricky, as Google doesn’t publish its processor types. Therefore, let’s look at their n1-standard-2, which has two vCPUs but twice the memory, and their n1-highcpu-2, which has two vCPUs and roughly half the memory.
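As a quick cross-check, the monthly figure above implies a familiar hourly rate. This sketch assumes an approximately 732-hour billing month, which is my assumption about the convention AWS's pricing pages used at the time, not something stated in the article:

```python
# Back out the hourly rate implied by the $76.86/month c3.large figure.
# The 732-hour billing month is an assumption about AWS's convention.

monthly_cost = 76.86
hours_per_month = 732            # assumed billing-month length
hourly_rate = monthly_cost / hours_per_month
print(f"Implied c3.large rate: ${hourly_rate:.3f}/hour")
```

This works out to about $0.105/hour, consistent with published c3.large U.S. East On-Demand pricing of that era.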
These instances, along with their effective monthly costs, are shown in the table to the right. Note that the Google Compute pricing in this case runs roughly 10-40% lower than AWS On-Demand pricing. The upper end of that range is comparable to AWS Reserved Instance pricing, but without requiring an upfront investment and without sacrificing “pay-as-you-go”.
The Google Compute costs assume 7×24 monthly usage, along with Google’s sustained-use pricing. If you’re not familiar with sustained-use pricing: the longer you run an instance in any given month, the deeper the discount.
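A sketch of how that tiered billing works, assuming the multipliers Google announced in 2014, under which each successive quarter of the month was billed at 100%, 80%, 60%, and 40% of the base rate. These tiers are my reading of the announcement, so verify them against current pricing docs:

```python
# Sketch of sustained-use billing: each successive quarter of the
# month is billed at a deeper discount. The tier multipliers are an
# assumption based on Google's 2014 announcement.

TIERS = [1.00, 0.80, 0.60, 0.40]   # price multiplier per 25% of month

def effective_multiplier(fraction_of_month_used):
    """Average price multiplier for an instance that runs this
    fraction of the month (0.0-1.0], billed tier by tier."""
    remaining = fraction_of_month_used
    billed = 0.0
    for tier in TIERS:
        in_tier = min(remaining, 0.25)
        billed += in_tier * tier
        remaining -= in_tier
    return billed / fraction_of_month_used

print(f"Full month: {1 - effective_multiplier(1.0):.0%} discount")  # 30%
print(f"Half month: {1 - effective_multiplier(0.5):.0%} discount")  # 10%
```

Under this schedule, an instance running the entire month pays an effective 70% of the base rate, a 30% discount with no upfront commitment.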
One other thing to note: AWS bills in 1-hour increments, which means that if you start an instance, you’re billed for the full hour even if it runs only a few minutes. Google Compute Engine charges a 10-minute minimum, then bills per minute after that, which is much closer to truly on-demand. If you have several instances that run intermittently throughout the month, Google groups them by instance type and zone, in a non-overlapping way, into what it calls an “inferred instance”. This may still let you take advantage of sustained-use discounts.
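To see how much those billing minimums matter for short-lived instances, here is a small sketch. The rounding rules are as described above; the helper names are mine:

```python
# Compare billed time for short-lived instances under the two models:
# AWS bills every started hour in full; GCE bills a 10-minute minimum,
# then per minute.

import math

def aws_billed_minutes(run_minutes):
    # Every started hour is billed as a full hour
    return math.ceil(run_minutes / 60) * 60

def gce_billed_minutes(run_minutes):
    # 10-minute minimum, then per-minute billing
    return max(run_minutes, 10)

for run in (3, 12, 61):
    print(f"{run}-min run: AWS bills {aws_billed_minutes(run)} min, "
          f"GCE bills {gce_billed_minutes(run)} min")
```

A 12-minute run bills as a full 60 minutes on AWS but only 12 minutes on GCE, and a 61-minute run bills as 120 minutes versus 61. For intermittent workloads, the difference compounds quickly.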
Getting the Best Savings
We’ve talked about a few ways to reduce your spend without sacrificing “pay as you go” (i.e., On-Demand). The first two approaches, instance right-sizing and parking calendars, help you optimize within any cloud provider. Ultimately, though, your best savings may come from switching providers. We saw that Google Compute Engine’s On-Demand pricing is less expensive than AWS’s and, in some cases, can rival the savings of AWS Reserved Instances, all without an upfront commitment and without sacrificing “pay as you go”. In my opinion, it’s worth a look.