
Interested in reducing your AWS Bill?
Tom Page Permanent, Freelance, DevOps...
A few tips I've been given by my Network (Part 1)
I often hear from clients and candidates about customers who've fallen into the trap of running up big bills by not using AWS services correctly:
https://www.theinformation.com/articles/aws-customers-rack-up-hefty-bills-for-moving-data
I asked my Network for some tips and general advice that I could share with anyone trying to avoid these nasty surprises.
1: Shut Down Unused AWS Resources
By using AWS OpsWorks and Elastic Beanstalk, developers can quickly deploy and redeploy applications with consistency, without worrying about how this affects the infrastructure 🏢
Obviously, ensure these development environments are shut down at the end of the working day and at the weekends.
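One way to automate those out-of-hours shutdowns is a small scheduled job that picks out the development instances to stop. The sketch below assumes an "Environment: dev" tagging convention (that tag name is an assumption, not something from AWS); in practice you'd feed it the instance descriptions returned by boto3's `ec2.describe_instances()` and pass the resulting IDs to `ec2.stop_instances()`.

```python
# Sketch: select running dev-environment instances to stop out of hours.
# The "Environment: dev" tag convention is an assumption - adapt it to
# whatever tagging scheme your team uses.

def stoppable_instance_ids(instances, tag_key="Environment", tag_value="dev"):
    """Return IDs of running instances tagged as development environments.

    `instances` uses the same shape boto3's describe_instances() returns
    for each instance: {"InstanceId": ..., "State": {"Name": ...},
    "Tags": [{"Key": ..., "Value": ...}, ...]}.
    """
    ids = []
    for inst in instances:
        tags = {t["Key"]: t["Value"] for t in inst.get("Tags", [])}
        running = inst.get("State", {}).get("Name") == "running"
        if running and tags.get(tag_key) == tag_value:
            ids.append(inst["InstanceId"])
    return ids
```

Run it from a scheduled Lambda or cron job in the evening, with a matching job that starts the instances again in the morning.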
2: Use the Right Storage Class
There are 5 tiers of Amazon S3 object storage available; which one works best for your needs?
Amazon S3 Standard - A general-purpose solution. As part of the AWS Free Usage Tier, a user gets 5 GB of Amazon S3 storage, 20,000 GET requests, 2,000 PUT requests, and 15 GB of data transfer out each month. Beyond this, the first 50 TB per month costs $0.023 per GB, the next 450 TB per month $0.022 per GB, and anything over 500 TB $0.021 per GB.
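Tiered pricing like this is easy to misestimate by hand, so here is a small sketch of the Standard-class storage arithmetic above (rates vary by region; check the current AWS price list before relying on the numbers):

```python
# Sketch of the S3 Standard tiered storage pricing described above.
# Rates are the per-GB-month figures quoted in the text; verify them
# against the current AWS price list for your region.

TIERS_GB = [
    (50 * 1024, 0.023),    # first 50 TB per month
    (450 * 1024, 0.022),   # next 450 TB per month
    (float("inf"), 0.021), # over 500 TB per month
]

def monthly_s3_standard_cost(total_gb):
    """Monthly storage cost in USD, applying each pricing tier in order."""
    cost, remaining = 0.0, total_gb
    for tier_gb, rate in TIERS_GB:
        used = min(remaining, tier_gb)
        cost += used * rate
        remaining -= used
        if remaining <= 0:
            break
    return cost
```

For example, storing 600 TB for a month works out to 51,200 GB at $0.023, 460,800 GB at $0.022, and 102,400 GB at $0.021, or $13,465.60 in total.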
Amazon S3 Infrequent Access (IA) - Best for data that is accessed less frequently but requires the same resiliency as the Standard storage class. It can still be retrieved rapidly when needed, but each retrieval is charged a fee of $0.01 per GB.
Amazon S3 One Zone Infrequent Access is an even less expensive option since the data is only stored in a single availability zone with less resiliency. As a result, One-Zone IA is a great option for storing secondary backups.
Amazon Glacier is designed for data retained in long-term storage, such as backups or cold data. The two main options are bulk retrievals (which take 5–12 hours to restore) and faster, more expensive expedited retrievals that take 1–5 minutes.
It might be an option to introduce object lifecycle management that automatically transitions data between the storage classes. For instance, you can automatically move your data from S3 Standard to IA after 30 days, archive data to Glacier after 90 days or set up a delete policy to expire specific objects after 180 days.
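That 30/90/180-day example can be expressed directly as an S3 lifecycle configuration. The sketch below builds the rule in the dictionary shape accepted by boto3's `s3.put_bucket_lifecycle_configuration(Bucket=..., LifecycleConfiguration=...)`; the `logs/` prefix and rule ID are hypothetical placeholders.

```python
# Lifecycle rule matching the example above: transition to Infrequent
# Access at 30 days, archive to Glacier at 90 days, expire at 180 days.
# The "logs/" prefix and rule ID are made-up examples - scope the rule
# to whatever objects suit your bucket.

lifecycle_configuration = {
    "Rules": [
        {
            "ID": "tier-then-expire",
            "Status": "Enabled",
            "Filter": {"Prefix": "logs/"},
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
            "Expiration": {"Days": 180},
        }
    ]
}
```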
3: Select the Right Instance Type
To maximize your workloads while minimizing your spend, consider your specific use case:
What is the type of processing unit and amount of memory required?
Choose the instance type that delivers the best performance for the price
At least twice a year, assess your choice of instances to ensure they match the reality of your workload
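A simple way to make that twice-yearly review concrete is to compare candidates on cost per unit of work, not raw hourly price. The figures below are made-up placeholders, not real AWS prices or benchmarks; substitute current pricing and your own workload's throughput numbers.

```python
# Sketch: rank candidate instance types by cost per unit of work.
# All prices and throughput figures here are invented placeholders -
# plug in real AWS pricing and benchmark results from your workload.

def cost_per_unit(hourly_price_usd, units_per_hour):
    """Dollars spent per unit of work (e.g. per 1,000 requests served)."""
    return hourly_price_usd / units_per_hour

candidates = {
    "type-a": cost_per_unit(0.096, 1000),  # placeholder figures
    "type-b": cost_per_unit(0.085, 1200),  # placeholder figures
}
best = min(candidates, key=candidates.get)
```

Here "type-b" wins despite neither being the cheapest per hour nor mattering in isolation: what counts is dollars per unit of useful work.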
Tagging your instances is not only best practice, it also lets you monitor the cost per hour of running systems in real time; these results can drive the development team to optimize costs. To enforce tagging discipline, open-source tools like Cloud Custodian can automatically stop any instance without a tag.
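Cloud Custodian policies are plain YAML. A minimal sketch of the enforcement idea above, assuming an "Owner" tag convention (the tag name and policy name are illustrative choices, not anything mandated by the tool), might look like:

```yaml
# Sketch of a Cloud Custodian policy: stop any EC2 instance that is
# missing an "Owner" tag. The tag name and policy name are assumptions.
policies:
  - name: stop-untagged-ec2
    resource: ec2
    filters:
      - "tag:Owner": absent
    actions:
      - stop
```

Run it with `custodian run`, ideally in dry-run mode first so the team can see which instances would be affected before enforcement kicks in.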
Do you have any other tips to help reduce the cost of AWS usage? Feel free to email me - Gerry.Darley@Foxtekrs.com.
Thanks,
Gerry