It is estimated that over 2,500 PB of data is created every day, and that rate only continues to accelerate. As organisations move their compute infrastructure to public clouds, they naturally consume the clouds' storage offerings too, since these are immediately available and scale to enormous capacities.
However, is this the most cost-effective approach? Cloud storage pricing typically has several components beyond the capacity consumed, such as API request and data egress charges, so calculating the total cost of ownership (TCO) is not always straightforward. Public clouds were popularised on the promise of scaling up and scaling down, but for storage this is rarely the reality: data sets tend to grow over time and seldom shrink.
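To make those cost components concrete, the sketch below estimates a monthly bill from three common pricing dimensions: stored capacity, API requests, and data egress. The prices are hypothetical placeholders chosen for illustration, not any provider's actual list prices.

```python
# Illustrative monthly cost estimate for cloud object storage.
# All prices below are hypothetical placeholders, not real list prices.

CAPACITY_PRICE_PER_GB = 0.023    # $/GB-month stored (assumed)
REQUEST_PRICE_PER_1000 = 0.005   # $ per 1,000 API requests (assumed)
EGRESS_PRICE_PER_GB = 0.09       # $/GB transferred out (assumed)

def monthly_storage_cost(capacity_gb: float, requests: int, egress_gb: float) -> float:
    """Sum the three dominant cost components of a cloud storage bill."""
    capacity_cost = capacity_gb * CAPACITY_PRICE_PER_GB
    request_cost = (requests / 1000) * REQUEST_PRICE_PER_1000
    egress_cost = egress_gb * EGRESS_PRICE_PER_GB
    return capacity_cost + request_cost + egress_cost

# A 1 PB data set with a read-heavy workload and 100 TB/month leaving the cloud:
print(f"${monthly_storage_cost(1_000_000, 50_000_000, 100_000):,.2f}")  # $32,250.00
```

Even in this rough sketch, egress accounts for a significant share of the bill, which is why bandwidth fees feature prominently in the alternative discussed next.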
What if you could deploy an open-source storage system in a location near a public cloud's data center and achieve significant cost savings over the lifetime of an important data set, while still maintaining low-latency access to that data from the public cloud and freeing yourself from costly bandwidth fees when accessing the data from outside that cloud?
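Many open-source storage systems expose an S3-compatible API, so compute running in the public cloud can address cloud-adjacent storage with the same client code it would use for native object storage. The sketch below assumes a hypothetical S3-compatible endpoint, credentials, and bucket; only the endpoint URL distinguishes it from a native cloud bucket.

```python
import boto3

# Point the standard S3 client at a cloud-adjacent, S3-compatible endpoint.
# The endpoint URL, credentials, and bucket below are hypothetical examples.
s3 = boto3.client(
    "s3",
    endpoint_url="https://storage.adjacent.example.com",  # assumed endpoint
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

# Reads from cloud compute pull data into the cloud over a short, low-latency
# path (cloud ingress is typically free), and the same endpoint is reachable
# from outside the cloud without paying the cloud provider's egress fees.
response = s3.get_object(Bucket="research-data", Key="datasets/sample-001.parquet")
data = response["Body"].read()
```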
This whitepaper examines the total cost of storing large data sets in public clouds and explores an alternative: a cloud-adjacent storage system that drives cost efficiency.
Key takeaways from the whitepaper:
- The challenges of storage cost management in public clouds
- What a cloud-adjacent storage system is
- How cloud-adjacent storage can be accessed from multiple public clouds
- A practical application of this approach
- A TCO analysis of both scenarios