Optimizing I/O for Data‑Intensive Analytics in Cloud‑Native Environments: Insights from Uber Presto
This whitepaper analyzes the shift of data‑intensive analytics to cloud‑native platforms, examines Uber Presto’s fragmented I/O patterns, reveals hidden storage‑API cost impacts, and proposes cloud‑aware I/O optimization strategies to improve performance‑cost efficiency.
The article explores the industry trend of migrating data‑intensive analytics applications from on‑premises to cloud‑native environments, emphasizing that the distinctive cost model of cloud storage demands a cost‑aware approach to performance optimization.
Through an empirical study of Uber's Presto production workload, the paper finds that over 50% of data accesses read less than 10 KB and more than 90% read less than 1 MB, indicating a highly fragmented access pattern whose financial implications differ markedly from those on traditional data platforms.
The case study demonstrates that conventional I/O optimizations, which ignore the financial cost of storage API calls, can result in unexpectedly high expenses when deployed in cloud settings.
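To make the per‑request cost effect concrete, here is a minimal back‑of‑the‑envelope sketch. The price constant and read sizes are illustrative assumptions loosely modeled on typical object‑storage per‑request billing, not actual vendor rates or figures from the paper:

```python
# Illustrative cost model for object-storage reads.
# GET_PRICE_PER_1K is an assumed per-request price (S3-style billing
# charges per 1,000 GET requests); it is NOT an actual vendor rate.
GET_PRICE_PER_1K = 0.0004  # dollars per 1,000 GET requests (assumed)

def request_cost(num_requests: int) -> float:
    """Dollar cost attributable to API calls alone, ignoring bandwidth."""
    return num_requests / 1_000 * GET_PRICE_PER_1K

# Reading 1 TB in 10 KB fragments vs. 8 MB coalesced ranges:
small_reads = (1 << 40) // (10 * 1024)  # ~107 million requests
large_reads = (1 << 40) // (8 << 20)    # ~131 thousand requests

print(f"fragmented reads: ${request_cost(small_reads):,.2f}")
print(f"coalesced reads:  ${request_cost(large_reads):,.2f}")
```

Even at these small per‑request prices, the fragmented pattern described above multiplies API‑call cost by roughly the ratio of the read sizes, which is why an optimization that only counts bytes transferred can look cheap while the bill says otherwise.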
Based on these observations, the paper presents logical I/O‑optimization techniques and strategies specifically designed for cloud environments, aiming to help architects design efficient I/O solutions that significantly improve cost‑performance for data‑intensive applications in the cloud.
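One widely used cloud‑aware technique in this family is read coalescing: merging nearby byte ranges into a single larger request, trading a little wasted bandwidth for far fewer billed API calls. The sketch below is a generic illustration of the idea, not the paper's implementation; the function name and the `max_gap` threshold are assumptions chosen for clarity:

```python
from typing import List, Tuple

def coalesce_ranges(ranges: List[Tuple[int, int]],
                    max_gap: int = 1 << 20) -> List[Tuple[int, int]]:
    """Merge byte ranges whose gap is at most max_gap bytes.

    Reading the small gap between two nearby ranges as part of one
    larger request usually costs less than issuing a second API call.
    Illustrative sketch; threshold and signature are assumptions.
    """
    if not ranges:
        return []
    ordered = sorted(ranges)
    merged = []
    cur_start, cur_end = ordered[0]
    for start, end in ordered[1:]:
        if start - cur_end <= max_gap:
            # Gap is small: extend the current request to cover both.
            cur_end = max(cur_end, end)
        else:
            # Gap is too large: flush and start a new request.
            merged.append((cur_start, cur_end))
            cur_start, cur_end = start, end
    merged.append((cur_start, cur_end))
    return merged

# Two 10 KB reads 10 KB apart collapse into one request; a range
# several megabytes away stays separate.
print(coalesce_ranges([(0, 10_000), (20_000, 30_000),
                       (5_000_000, 5_010_000)]))
```

The `max_gap` knob is where the cloud cost model enters the design: on a per‑request‑billed store it can be set far higher than on local disk, because each avoided call has a direct dollar price attached.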
DataFunTalk