Rakuten Unveils 700B Parameter Open-Source Japanese LLM
- Rakuten releases a Japanese-specialized LLM with 700 billion parameters, one of the largest in Japan.
- The model adopts a Mixture of Experts (MoE) architecture to balance advanced reasoning with efficient processing.
- Released under the Apache 2.0 license; developed as part of the Ministry of Economy, Trade and Industry's GENIAC project.
Rakuten Group has launched its latest large language model, Rakuten AI 3.0, aiming to set a new standard for AI use in Japanese-language environments. Developed as part of the third phase of GENIAC, a generative AI development support program led by the Ministry of Economy, Trade and Industry (METI) and the New Energy and Industrial Technology Development Organization (NEDO), the model has approximately 700 billion parameters, making it one of the largest in the country. Following an initial announcement in December 2025, it underwent extensive fine-tuning to optimize performance on specific tasks. It now posts world-class results on benchmarks covering logical reasoning, competitive mathematics, and instruction following, often rivaling or surpassing leading global models.
At its technical core is a Mixture of Experts (MoE) architecture: rather than running every parameter for every token, a router directs each token to a small subset of specialized expert networks. This lets the model maintain an immense overall knowledge base while activating only a fraction of its parameters at any moment, keeping compute costs well below what a dense model of comparable size would require. By combining Rakuten's large corpus of proprietary bilingual data with the latest insights from the open-source community, the model achieves the precision needed for professional applications. It demonstrates a deep understanding of Japanese linguistic nuance and cultural context, enabling it to excel at complex programming, sophisticated document analysis, and natural text generation.
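For readers unfamiliar with how MoE routing works in practice, the sketch below shows the general idea: a small router scores a pool of expert feed-forward networks and only the top-scoring few process each token. This is a simplified, generic PyTorch example of the technique, not Rakuten's implementation; the expert count, layer sizes, and top-k value are illustrative assumptions.

```python
# Minimal sketch of a Mixture of Experts (MoE) layer with top-k routing.
# Illustrative only: sizes, expert count, and top_k are invented values,
# not details of Rakuten AI 3.0.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=1024, d_ff=4096, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Each expert is an independent feed-forward sub-network.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])
        # The router scores every expert for every token.
        self.router = nn.Linear(d_model, num_experts)

    def forward(self, x):  # x: (batch, seq_len, d_model)
        scores = self.router(x)                           # (B, S, num_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # normalize over the chosen experts
        out = torch.zeros_like(x)
        # Only the top-k experts run for each token, so the number of active
        # parameters per token is a small fraction of the total.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., slot] == e            # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out
```

The design trade-off is that total parameter count (and therefore stored knowledge) can grow with the number of experts, while per-token compute scales only with the few experts actually selected.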
The most significant strategic move is the decision to release the model as an open-source asset under the Apache 2.0 license. By making the model weights publicly available through Rakuten’s official repository, the company empowers developers and businesses to build secure AI applications on their own infrastructure. This initiative reflects Rakuten’s commitment to "AI-nization"—a company-wide AI transformation—by sharing its technological breakthroughs to accelerate innovation across the domestic AI ecosystem. This release represents more than just a product update; it is a vital step toward redefining Japan’s competitive edge in the global AI landscape.
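As an illustration of what self-hosted use of openly published weights might look like, the snippet below loads a model with the Hugging Face Transformers library and runs a short generation locally. The repository identifier is a placeholder, not a confirmed name; the actual id should be taken from Rakuten's official repository.

```python
# Sketch of self-hosted inference with openly released weights.
# The model id below is hypothetical; replace it with the identifier
# published in Rakuten's official repository.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Rakuten/placeholder-model-id"  # hypothetical placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Japanese prompt: "Please summarize Rakuten Group's latest LLM."
prompt = "楽天グループの最新LLMについて要約してください。"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the weights run on infrastructure the user controls, prompts and generated text never have to leave the organization's own environment, which is the "secure AI applications" point the release emphasizes.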