Foxconn Unveils ‘FoxBrain,’ Its First Large Language Model

Mar 11, 2025

Taiwanese tech giant Foxconn has introduced its first Large Language Model (LLM), ‘FoxBrain’, as part of its strategy to integrate AI technology into manufacturing and supply chain management. The announcement was made on Monday, as reported by Reuters.

According to Foxconn, ‘FoxBrain’ was trained over four weeks using 120 NVIDIA H100 GPUs and is based on Meta’s Llama 3.1 architecture. The company describes it as the first Taiwanese large language model, optimized for Traditional Chinese and Taiwanese language styles.

The company acknowledged that the model slightly lags behind DeepSeek’s distillation model from China, but maintains that ‘FoxBrain’ is approaching the standards of world-class AI models.

Initially developed for internal use, the model demonstrates capabilities in data analysis, decision support, document collaboration, mathematics, logic, problem-solving, and coding. Foxconn also plans to expand its applications through collaborations with technology partners in the future.

The model was trained with the support of NVIDIA’s Taiwan-based supercomputer, Taipei-1. Further details are expected to be announced at NVIDIA’s upcoming GTC Developer Conference.