Taiwan’s Foxconn, the world’s largest contract electronics manufacturer, has taken a significant leap into artificial intelligence with the launch of its first large language model, “FoxBrain.” The company announced on Monday that the model is aimed at transforming manufacturing and supply chain management through advanced AI capabilities.
FoxBrain was trained on 120 of Nvidia's H100 GPUs and completed training in just four weeks. The model is built on Meta's Llama 3.1 architecture. Foxconn, known globally for assembling Apple's iPhones and producing Nvidia's AI servers, said FoxBrain is Taiwan's first large language model with reasoning capabilities optimized for traditional Chinese and Taiwanese language styles.
Despite a slight performance gap compared with China's DeepSeek distillation model, Foxconn said FoxBrain's overall performance is close to world-class standards. The company envisions the model as a cornerstone of its internal operations, supporting data analysis, decision support, document collaboration, mathematics, reasoning and problem-solving, and code generation.
Although FoxBrain was initially developed for in-house applications, Foxconn plans to extend its use by collaborating with technology partners. The company also intends to share open-source information and promote AI adoption across manufacturing, supply chain operations, and intelligent decision-making processes.
The development of FoxBrain received significant support from Nvidia, which provided technical consultancy and computing power via its Taiwan-based supercomputer “Taipei-1.” Located in Kaohsiung, Taipei-1 is the largest supercomputer in Taiwan and played a critical role in the training of FoxBrain.
With this move, Foxconn is not only strengthening its AI capabilities but also contributing to Taiwan’s growing presence in the global artificial intelligence landscape.