Gemma Gains Ground: Google’s AI Model Surpasses 150 Million Downloads

Google's Gemma AI model has surpassed 150 million downloads, marking a significant milestone in the tech giant’s push into the competitive open AI model space. The news was revealed by Omar Sanseviero, Developer Relations Engineer at Google DeepMind, on X (formerly Twitter), as reported by TechCrunch.
Sanseviero further noted that developers have already created more than 70,000 variants of the Gemma model on the popular AI development platform Hugging Face.
Google unveiled Gemma in February 2024 to compete with other “open” model families such as Meta’s LLaMA. The latest versions of Gemma are multimodal, meaning they can process both text and images, and support over 100 languages. Google has also released specially fine-tuned versions for sector-specific applications, such as drug discovery.
Despite surpassing 150 million downloads in just over a year, Gemma still trails far behind its rival LLaMA, which had exceeded 1.2 billion downloads as of late April.
Notably, both Gemma and LLaMA are released under custom, non-standard licenses that have drawn criticism. Many developers view these licensing terms as restrictive, making commercial deployment of the models potentially risky.