Amazon S3 Costs: Navigating the Different Types of Storage
Amazon S3 costs can vary widely depending on how you use the different storage classes. Many organizations overspend on underutilized and unused S3 resources. Charges hinge on four factors: storage, requests and data retrievals, data management, and data transfer.
Generally, controlling AWS storage costs requires IT admins to make a habit of:
- Cleaning up incomplete uploads.
- Deleting previous object versions that are no longer needed.
- Reviewing your storage class transition costs.
- Reviewing data retrieval costs.
- Tracking requests made to your bucket.
- Reviewing the cost of each bucket.
- Understanding the relationship between usage and charges.
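Several of the habits above can be automated with an S3 Lifecycle configuration. The sketch below builds one as a plain dictionary in the shape accepted by boto3's `put_bucket_lifecycle_configuration`; the rule IDs and day thresholds are illustrative assumptions, not recommendations.

```python
import json

# Hypothetical lifecycle rules implementing two of the cleanup habits above.
# Rule IDs and day thresholds are illustrative, not recommendations.
lifecycle_config = {
    "Rules": [
        {
            # Clean up incomplete multipart uploads after a week.
            "ID": "abort-incomplete-multipart-uploads",
            "Status": "Enabled",
            "Filter": {},  # empty filter: applies to every object in the bucket
            "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
        },
        {
            # Expire previous object versions that are no longer needed.
            "ID": "expire-old-object-versions",
            "Status": "Enabled",
            "Filter": {},
            "NoncurrentVersionExpiration": {"NoncurrentDays": 30},
        },
    ]
}

# With boto3, this configuration could be applied via:
#   s3 = boto3.client("s3")
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="example-bucket", LifecycleConfiguration=lifecycle_config)
print(json.dumps(lifecycle_config, indent=2))
```

Once rules like these are in place, S3 enforces them continuously, so the cleanup stops depending on anyone remembering to do it.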
To identify where your organization can save, you need to understand how it uses each S3 storage class.
Amazon S3 Storage Classes
1. General Purpose
S3 Standard is the default storage class. It offers highly available, durable, high-performance object storage for data that you access frequently. Because it delivers high throughput and low latency, it suits a wide range of use cases:
- Cloud applications
- Content distribution
- Dynamic websites
- Mobile and gaming applications
- Big data analytics
You can configure this general-purpose storage at the object level. A single bucket can hold objects stored across S3 One Zone-IA, S3 Standard-IA, S3 Intelligent-Tiering, and S3 Standard. It is also possible to use S3 Lifecycle policies to transition objects between different storage classes without changing applications.
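As a sketch of the lifecycle transitions just mentioned, the rule below ages objects into progressively cheaper classes without any application changes. It uses the dictionary shape accepted by boto3's `put_bucket_lifecycle_configuration`; the prefix and day thresholds are illustrative assumptions.

```python
import json

# Hypothetical lifecycle rule: transition aging objects under logs/ into
# cheaper storage classes. Prefix and day thresholds are illustrative.
transition_rule = {
    "ID": "tier-down-aging-logs",
    "Status": "Enabled",
    "Filter": {"Prefix": "logs/"},
    "Transitions": [
        {"Days": 30, "StorageClass": "STANDARD_IA"},  # infrequent access after 30 days
        {"Days": 365, "StorageClass": "GLACIER"},     # archive after a year
    ],
}

# With boto3, this rule could be applied via:
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="example-bucket",
#       LifecycleConfiguration={"Rules": [transition_rule]})
print(json.dumps(transition_rule, indent=2))
```

The applications reading these objects keep using the same bucket and keys; only the per-GB price and retrieval characteristics change as objects move.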
2. Unknown or Changing Access
This is the only storage class that delivers automatic cost savings: it moves objects between four access tiers whenever data access patterns change.
Amazon S3 Intelligent-Tiering optimizes costs by moving data to the most affordable access tier, without operational overhead or performance impact. It stores objects in four tiers:
- Two optional archive tiers -- meant for asynchronous access with optimization for rare access.
- Two low-latency tiers -- optimized for both infrequent and frequent access.
When you transition objects to Amazon S3 Intelligent-Tiering, it stores them in the Frequent Access tier. The service monitors your organization's access patterns and moves objects you haven't accessed for 30 consecutive days to the Infrequent Access tier. Whenever you access an object stored in the Infrequent Access tier, the service moves it back to the Frequent Access tier.
AWS does not charge retrieval fees for this storage class, and you incur no charges when objects move between tiers within Amazon S3 Intelligent-Tiering. This storage class is a good fit for data sets with unknown or unestablished access patterns, such as those of new applications.
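New objects can also land in Intelligent-Tiering directly at upload time, rather than via a lifecycle transition. The sketch below builds the arguments for boto3's `put_object` call; the bucket name, key, and body are illustrative assumptions.

```python
# Hypothetical upload that stores an object directly in Intelligent-Tiering.
# Bucket, key, and body are illustrative placeholders.
put_object_args = {
    "Bucket": "example-bucket",
    "Key": "uploads/report.csv",
    "Body": b"col_a,col_b\n1,2\n",
    # Valid StorageClass values include STANDARD, STANDARD_IA, ONEZONE_IA,
    # INTELLIGENT_TIERING, GLACIER, and DEEP_ARCHIVE.
    "StorageClass": "INTELLIGENT_TIERING",
}

# With boto3: boto3.client("s3").put_object(**put_object_args)
print(put_object_args["StorageClass"])
```

From that point on, S3 handles tier placement automatically as the object's access pattern emerges.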
3. Infrequent Access
S3 Standard-Infrequent Access (Standard-IA)
This storage class lets you cut storage costs for data you don't access frequently but need immediately when you do. Standard-IA gives organizations the high durability, high throughput, and low latency of S3 Standard, and it inherits all S3 Standard features, including access management, security, event notifications, cross-region replication, and data lifecycle policies.
S3 One Zone-Infrequent Access (S3 One Zone-IA)
S3 One Zone-IA is low-cost storage meant for re-creatable data. You can use it to store secondary backup copies, or data already replicated to another AWS Region, for disaster recovery and compliance purposes. Just like S3 Standard and Standard-IA, S3 One Zone-IA inherits all existing S3 features:
- Access management
- Event notifications
- Cross-region replication
- Data lifecycle policies
It also provides the same low latency, high throughput, and high durability. This storage class lets you store your least frequently accessed data within a single Availability Zone, lowering costs by about 20 percent compared with Standard-IA.
4. Archive
S3 Glacier
The name S3 Glacier derives from the fact that this storage class manages your "cold" data: files you don't currently need but might need down the road. This tier is for data that you want to store for decades with high durability but without the need to access it regularly.
S3 Glacier does not restrict the type of data organizations can store; you can keep anything in this tier. It stores data redundantly on multiple devices within each storage facility and across multiple facilities, which increases availability and durability. S3 Glacier comes with configurable retrieval times that usually range from a few minutes to several hours.
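Those configurable retrieval times are chosen per restore request. The sketch below builds the request body in the shape accepted by boto3's `restore_object`; the bucket, key, and copy duration are illustrative assumptions, while Expedited, Standard, and Bulk are Glacier's documented retrieval tiers.

```python
# Hypothetical restore request for an object archived in S3 Glacier.
# Bucket, key, and the Days value are illustrative placeholders.
restore_request = {
    "Days": 2,  # how long S3 keeps the temporary restored copy available
    "GlacierJobParameters": {
        # "Expedited" -> minutes, "Standard" -> hours, "Bulk" -> cheapest/slowest
        "Tier": "Standard",
    },
}

# With boto3:
#   s3.restore_object(Bucket="example-bucket",
#                     Key="archive/2019-backup.tar",
#                     RestoreRequest=restore_request)
print(restore_request["GlacierJobParameters"]["Tier"])
```

Faster tiers cost more per GB retrieved, so the tier choice is itself a cost lever: batch, non-urgent restores generally belong in the cheaper tiers.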
S3 Glacier Deep Archive
S3 Glacier Deep Archive is the cheapest AWS storage class. It's designed for long-term storage and digital preservation of data you access rarely, perhaps once or twice a year. This tier is ideal for organizations in highly regulated industries where regulations require archiving data for several years. Organizations can also use S3 Glacier Deep Archive for backup and disaster recovery.
Amazon S3 Glacier Deep Archive disperses all the data it stores across three different availability zones. Organizations can retrieve the data in about 12 hours.