Nvidia brings its AI computing platform to cloud data firm Snowflake


Snowflake Chairman and CEO Frank Slootman presents a snowboard as a gift to NVIDIA CEO Jensen Huang at Snowflake Summit 2023, in Las Vegas, Nevada, U.S. June 26, 2023. Courtesy of Snowflake/Handout via REUTERS

OAKLAND, California, June 26 (Reuters) – Snowflake (SNOW.N), a cloud data analytics company, is partnering with computing company Nvidia (NVDA.O) to let customers in industries ranging from financial services to healthcare and retail build AI models using their own data.

The two companies announced the partnership at Snowflake Summit 2023 on Monday.

“In the old days, in small data computing, you moved data to the computer,” Nvidia Chief Executive Jensen Huang told Reuters. “But when you have giant amounts of data like Snowflake does, and the pile of proprietary data … data that’s so valuable to a company, then you move the compute to the data.”

In this case, Nvidia is making the “fairly engineering intensive” move of embedding its NeMo platform for training and running generative AI models into the Snowflake Data Cloud, Huang said.

The partnership comes as the chatbot ChatGPT has pushed many companies to define their AI strategies and has propelled Nvidia, which provides the main hardware for AI, to become a trillion-dollar company.

“This is significant. This is the last mile that we’ve been waiting for 40 years,” said Frank Slootman, Chairman and CEO of Snowflake. “Every industry is on this. They used to say software is eating the world. Well, now data is eating software,” he said about the importance of data today.

Slootman said companies that use Snowflake to manage their data will now be able to use that data to train new AI models and gain a business advantage, without risking losing control of it.

No financial details of the partnership were disclosed, but Huang said Nvidia would benefit as more customers use computing for AI work.

“We sell more chips, and we have an operating system for AI called Nvidia AI Enterprise. And that operating system makes it possible for our chips to process AI,” said Huang. Nvidia charges customers for the use of its Nvidia AI Enterprise software.

Reporting By Jane Lanhee Lee
Editing by Nick Zieminski


