Artificial intelligence isn’t just about algorithms. Some would argue that the data on which models are trained matters more than the models themselves, a view underscored by IDC’s prediction that more than 44 zettabytes of digital data will be created by 2020. Thankfully, the rise of big data has coincided with a continued decline in cloud storage pricing, driven in part by falling media costs, better management tools, and innovations in object storage.

But not all cloud storage providers are created equal. Some lack the fine-grained management tools required to collate, process, and transfer AI model data quickly and efficiently. And not all enterprises have storage stacks optimized for data science workflows.

Nvidia and data storage company NetApp today jointly announced what they believe is a solution: Ontap AI, which they describe as an “AI-proven architecture.” Powered by Nvidia’s DGX supercomputers and NetApp’s AFF A800 cloud-connected flash storage, it’s designed to help organizations achieve “edge to core to cloud” control over their data by delivering unprecedented access and performance, said Octavian Tanase, senior vice president at NetApp.

“Our unique vision of a data pipeline [affords] simplicity of deployment,” he told VentureBeat in a phone interview. “People are looking for scale — they want to start small and grow. At the end of the day, we want customers to be able to manage data across the edge … correlate datasets, build large data lakes, [and] ultimately make faster decisions and better decisions [about] data.”

The connective tissue that ties…
