Note from DSA Editor: as we continue to see innovative ways to gain federated access to all of your data, such as cloud-based data warehousing and data virtualization, this announcement by Pure Storage adds to the options available to drive faster time to insight.
Pure Storage, the all-flash storage platform that helps innovators build a better world with data, today introduced a data hub, the company's vision to modernize storage architecture for unstructured, data-intensive workloads. Built on Pure Storage FlashBlade™, Pure's data hub is designed to be truly data-centric and to enable organizations to effectively utilize today's most critical currency – data.
To innovate and survive in a business environment that is increasingly data-driven, organizations must design infrastructure with data in mind and have complete, real-time access to that data. Today's mainstream solutions were designed for the world of disk and have historically helped create silos of data. A data hub is designed to deliver, share and unify data to ultimately unlock unprecedented value.
"Organizational data silos are a universal pain point across every industry. Businesses need to realize value from data even when it's out of sight and out of mind, which is impossible without insight into the full picture," said Matt Burr, GM of FlashBlade, Pure Storage. "With a data hub, we've created a central storage system that satisfies current and future application requirements with a modern platform designed to work on customers' behalf."
Today, organizations rely on four inherently siloed analytics solutions: data warehouses, data lakes, streaming analytics and AI clusters. A data hub integrates the most important features of these four silos and unifies them into a single platform. A data hub must have four key characteristics:
High-throughput for both file and object storage. Backup and data warehouse appliances require massive throughput for mainstream, file-based workloads and cloud-native, object-based applications.
True scale-out design. The power of a data lake is its native, scale-out architecture, which allows batch jobs to scale limitlessly as software -- not the user -- manages resiliency and performance.
Multi-dimensional performance. Data is unpredictable and can arrive at any speed – therefore, organizations need a platform that can process any data type with any access pattern.
Massively parallel. Within the computing industry, there has been a drastic shift from serial to parallel technologies, built to mimic the human brain, and storage must keep pace.
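The serial-to-parallel shift named in the last characteristic can be illustrated with a minimal, hypothetical sketch (the chunked data and the `process` function are stand-ins, not anything from Pure's platform): the same per-chunk work is dispatched across workers instead of one chunk at a time, which is the access pattern a massively parallel storage layer must sustain.

```python
from concurrent.futures import ThreadPoolExecutor

def process(chunk):
    # Stand-in for any per-chunk transformation (parsing, feature extraction, ...).
    return sum(chunk)

# Ten illustrative chunks of data.
chunks = [list(range(i, i + 100)) for i in range(0, 1000, 100)]

# Serial: one chunk at a time.
serial_results = [process(c) for c in chunks]

# Parallel: all chunks dispatched at once; results return in input order.
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel_results = list(pool.map(process, chunks))

assert serial_results == parallel_results
```

The point of the sketch is only that parallel dispatch produces the same results while letting many readers hit the storage layer concurrently; a serial architecture bottlenecks exactly this kind of workload.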
"For decades, the storage industry has been a laggard. Not only has storage failed to keep up with advances in networking and compute, it has become a roadblock to innovation," said Ritu Jyoti, Program Vice President, Systems Infrastructure Research Portfolio at IDC. "In the era of AI and real-time analytics, roadblocks have the potential to disrupt Fortune 500 companies within a short span of time. It is time for a paradigm shift for storage -- a new dynamic architecture, purpose-built for modern challenges."
ElementAI, a company that delivers cutting-edge AI products at scale for enterprises, has seen first-hand the need for a fundamental shift in storage architectures that puts the emphasis on data sharing and delivery rather than data storage alone.
"To keep pace with modern innovation, enterprises need to jumpstart modern AI initiatives -- but are often burdened with legacy data silos, in particular data lakes," said Jean-Francois Gagne, Founder and CEO of ElementAI. "To build more powerful products and results, data needs to be unified and delivered rather than simply captured and stored. A data hub is the vision for a new storage architecture designed for this evolution, purpose-built to power the next generation of AI products."
"Enterprises in every industry are working to build their future with AI," said Jim McHugh, vice president and general manager of Deep Learning Systems at NVIDIA. "While NVIDIA delivers leaps in innovation and performance to power AI, data is the fuel and the storage industry must keep pace. The data hub architecture will assist customers in modernizing their storage infrastructure to maximize the compute power required for AI."
PNY Technologies is a leader in high performance computing, graphics virtualization and deep learning technologies. Its partner network is focused on bringing AI and modern analytics to its customers.
"The biggest challenge for our customers is how to make data intelligent," said Jérôme Bélan, EMEA CEO at PNY Technologies. "Currently with data sitting in silos it isn't possible to deliver on its true value. We welcome Pure Storage's call for a modern architecture to address this issue and ensure businesses can make the most of their most important asset. As more businesses look to AI, analytics and cloud-native applications they need an architecture that can power new technologies, now and in the future. We are looking forward to working with Pure to deliver unprecedented value to our customers through a data hub."