The Secret to Maximizing Returns on Auto-Optimized Tiered Storage

Buurst Staff

Intelligent storage tiering can save money, but you can save up to an additional 75% by applying dedupe and data compression first

Businesses deal with large volumes of data every day and continue to add to it at a rate that’s often difficult to keep up with. Data management is a continuous challenge, and data storage is an ever-growing expense. While the cost per GB of storage may not seem significant, it adds up quickly over days and months and becomes a significant chunk of ongoing expenses.

More often than not, it is not feasible to delete or erase old data. Data must be retained for reasons such as legal compliance, building databases, machine learning, or simply because it may be needed later. Yet a large portion of that data often goes untouched for months at a time, with no need for access, while it continues to rack up storage bills.

Cutting costs with Automated Tiered Storage

As a solution to this problem, many storage providers and NAS filers offer automated tiering for storage. With automated tiering, data is stored across different classes of disks to save on storage costs: data that is accessed less frequently is placed on lower-performance, higher-latency disks that are much cheaper, usually 50-60% cheaper than high-performance disks.

It is often difficult to identify and isolate data that is less likely to be accessed, so policies are put in place to identify and shift data automatically. For instance, you may set the threshold at six weeks: once six weeks have passed without a certain block of data being accessed, that block is automatically moved down to a lower tier, where it is less expensive to keep storing it.
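
To make the idea concrete, here is a minimal sketch of an age-based tiering policy in Python. The six-week threshold matches the example above; the tier names, data structures, and helper function are hypothetical, not any particular vendor’s API.

```python
from datetime import datetime, timedelta

# Hypothetical age-based tiering policy: blocks untouched for longer
# than the threshold become candidates for demotion to a cheaper tier.
TIER_DOWN_THRESHOLD = timedelta(weeks=6)

def blocks_to_demote(blocks, now=None):
    """Return IDs of blocks whose last access is older than the threshold.

    `blocks` is an iterable of (block_id, last_access) pairs; in a real
    system this metadata would come from the filer itself.
    """
    now = now or datetime.utcnow()
    return [block_id for block_id, last_access in blocks
            if now - last_access > TIER_DOWN_THRESHOLD]

# Example: one block cold for eight weeks, one touched yesterday.
blocks = [
    ("block-001", datetime.utcnow() - timedelta(weeks=8)),
    ("block-002", datetime.utcnow() - timedelta(days=1)),
]
print(blocks_to_demote(blocks))  # ['block-001']
```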

So intelligent tiering of data helps reduce storage costs significantly without really impacting day-to-day operations.

The Big Guns: Dedupe and Data Compression

While tiered storage helps you save by optimizing where data is stored, the expense is still proportional to the amount of data; after all, you pay per GB of storage. But before you even store data, have you asked: is this data streamlined? Am I saving data that could be pared down?

Deduplication

Unnecessarily bulky data is more common than you’d expect. Every time an old file or project is pulled out of storage to be updated, ramped up, or changed in any way, a new file is saved. So even if changes are made to only 1 or 2 MB of data, a new copy of the entire 4 TB file is created and saved. Now imagine this happening with several files each day. With this replication occurring over and over again, the total amount of data quickly multiplies, occupying more storage, spiking storage costs, and even affecting IOPS. This is where inline deduplication helps.

With inline deduplication, data is compared block by block for redundancies before it is written, and redundant blocks are eliminated; instead of a duplicate copy, a reference count to the existing block is kept. In most cases, data is reduced by 20-30% by making inline dedupe part of the storage efficiency process delivered by a NAS filer.
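
The mechanics can be pictured with a toy sketch: fixed-size blocks, a content hash per block, and a reference count in place of a second copy. This is only an illustration of the concept, not how any particular filer implements it.

```python
import hashlib

BLOCK_SIZE = 4096  # bytes; real systems choose their own block size

class DedupStore:
    """Toy block store: identical blocks are stored once and reference-counted."""

    def __init__(self):
        self.blocks = {}     # content hash -> block data
        self.refcounts = {}  # content hash -> number of references

    def write(self, data):
        """Split data into blocks, store only unique ones, return their hashes."""
        hashes = []
        for offset in range(0, len(data), BLOCK_SIZE):
            block = data[offset:offset + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            if digest not in self.blocks:
                self.blocks[digest] = block  # first copy: actually store it
            self.refcounts[digest] = self.refcounts.get(digest, 0) + 1
            hashes.append(digest)
        return hashes

# Writing the same 8 KB payload twice stores its (identical) blocks only once.
store = DedupStore()
payload = b"x" * 8192
store.write(payload)
store.write(payload)
print(len(store.blocks))               # 1 unique block stored
print(list(store.refcounts.values()))  # [4] references to it
```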

Data compression

Reducing the number of bits needed to represent the data through compression is a simple process and can be highly effective – data can be reduced by 50-75%. The extent of compression depends on the nature of the data and how compressed it is at the outset. For instance, an mp4 file is already a highly compressed format. But, in our experience, data usually offers good opportunities for reduction through compression.
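
As a rough illustration of how much the data itself matters, the sketch below uses Python’s built-in zlib: repetitive, text-like data shrinks dramatically, while random bytes (standing in here for already-compressed content such as an mp4) barely shrink at all. The sample data is made up for the demonstration.

```python
import os
import zlib

def compressed_fraction(data: bytes) -> float:
    """Return compressed size as a fraction of the original size."""
    return len(zlib.compress(data)) / len(data)

# Repetitive, text-like data compresses very well.
text_like = b"customer_record;status=active;region=us-east;\n" * 20000
# Random bytes stand in for data that is already highly compressed.
already_compressed = os.urandom(len(text_like))

print(f"text-like data:          {compressed_fraction(text_like):.1%} of original size")
print(f"already-compressed data: {compressed_fraction(already_compressed):.1%} of original size")
```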

By compressing data, the amount of storage space needed is reduced, and the costs associated with storage come down too.

When we combine the effect of deduplication and compression, we find that customers reap savings of up to 90%! If this new, streamlined data is then stored using automated tiering, the savings are amplified because:

1. The amount of data to be stored is reduced, thus saving on storage capacity needed across ‘hot’ and ‘cool’ tiers
2. Input/Output is reduced, leading to better performance

Data deduplication and compression explained in a use case

Let’s say we have 1 TB of actual data to store. On average, cloud SSD storage costs $0.10 per GB/month, so $100 per TB/month.

If the data can be reduced by 80% using deduplication and compression (which is likely), the effective cost per TB is just 20% of the original projection, or $20 per TB/month. Now add in auto-tiering, which cuts the cost in half again by using a combination of SSD and lower-cost HDD, and you have a $10/TB cost basis.

If your storage needs grow to 10 TB over time, you will pay $100 per month – the same amount you would have been paying for basic file storage of just 1 TB of data before dedupe and data compression.
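
The arithmetic in this example can be checked in a few lines; every figure below is one of the illustrative assumptions above, not a quoted price.

```python
# Worked version of the example above, using the article's assumed figures.
ssd_cost_per_gb_month = 0.10                  # $/GB/month for cloud SSD
cost_per_tb = ssd_cost_per_gb_month * 1000    # $100 per TB/month

data_reduction = 0.80                         # 80% removed by dedupe + compression
after_reduction = cost_per_tb * (1 - data_reduction)      # $20 per TB/month

tiering_savings = 0.50                        # auto-tiering cuts the rest in half
after_tiering = after_reduction * (1 - tiering_savings)   # $10 per TB/month

print(f"Effective cost: ${after_tiering:.0f} per TB/month")
print(f"10 TB would cost ${after_tiering * 10:.0f} per month")
```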

The net effect of these combined storage efficiency capabilities delivered by SoftNAS, for example, is to reduce the effective cost from $0.10 per GB/month to $0.02 per GB/month by combining tiering, compression and deduplication – without sacrificing performance. With the rapidly increasing amount of data that must be managed, who doesn’t want to cut their cloud storage bill to one-fifth of what it was?

Auto-Optimized Tiered Storage with SoftNAS

Buurst SoftNAS offers SmartTiers, auto-tiered storage with the added advantage of flexible operation. After deduplication and compression, data is stored in tiers that you can configure and control, with policies that suit your usage patterns and that you can optimize further as your usage evolves. Our goal is to achieve the price/performance balance that suits your business, so even when data stored in a low-cost tier is accessed, only the particular block that was accessed migrates up to the hot tier, not the entire file. Through a user-friendly interface, you continue to manage the policies, thresholds, and tier capacities, and the rest of the cost-saving optimization happens automatically and transparently.
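
The block-level migration idea can be pictured with a simple sketch. This is an illustration of the general concept only, not SoftNAS’s implementation: reading one block of a large, cooled-down file promotes just that block to the hot tier.

```python
# Conceptual illustration of block-level promotion: only the block that
# is actually read moves to the hot tier, not the whole file.
class TieredFile:
    def __init__(self, num_blocks: int):
        # Every block starts out in the low-cost "cool" tier.
        self.tier = {block: "cool" for block in range(num_blocks)}

    def read_block(self, block: int):
        # Promote just the block that was touched.
        self.tier[block] = "hot"

f = TieredFile(num_blocks=1000)   # e.g. a large, rarely used file
f.read_block(42)                  # one block is accessed
hot = sum(1 for t in f.tier.values() if t == "hot")
print(f"{hot} of {len(f.tier)} blocks promoted to the hot tier")  # 1 of 1000
```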

Want to know what kind of savings you can achieve with SoftNAS SmartTiers? Try our storage cost-savings calculator.
