Google’s Hugging Face deal puts ‘supercomputer’ power behind open-source AI



Hugging Face is one of the most popular AI model repositories, hosting open-source foundation models such as Meta’s Llama 2 and Stability AI’s Stable Diffusion. It also hosts many datasets for model training.

There are over 350,000 models hosted on the platform for developers to work with, and they can upload their own models to Hugging Face much as coders publish their code on GitHub. Valued at $4.5 billion, Hugging Face raised $235 million over the past year from investors including Google, Amazon, and Nvidia.

Google said Hugging Face users will be able to start using its AI app-building platform Vertex AI and Google Kubernetes Engine to help train and fine-tune models “in the first half of 2024.”

Google said in a statement that its partnership with Hugging Face “advances Google Cloud’s support for open-source AI ecosystem development.” Some of Google’s models are on Hugging Face, but its flagship large language models, such as Gemini, which now powers the chatbot Bard, and its text-to-image model Imagen are not on the repository and are considered more closed-source models.
