$XIAOMI-W (01810.HK)$Xiaomi's MiMo punches above its weight, eclipses OpenAI and Alibaba in AI tests
Moriarty mcG
OP
:
Xiaomi recently unveiled MiMo, its first open-source large language model (LLM) designed for reasoning tasks, announced on April 30, 2025. Here’s a detailed breakdown based on available information:

What is MiMo?
MiMo (likely short for "Mindful Model" or a similar term, though not explicitly defined) is a 7-billion-parameter LLM series trained from scratch by Xiaomi’s LLM-Core Team for mathematics, coding, and general reasoning tasks. It is optimized through pre-training and post-training (reinforcement learning, or RL) to enhance reasoning capabilities. The MiMo-7B series includes:
- MiMo-7B-Base: pre-trained on ~25 trillion tokens with a multi-token prediction objective to boost performance and inference speed.
- MiMo-7B-SFT: a supervised fine-tuned version of the base model.
- MiMo-7B-RL: RL-tuned from the cold-start SFT model, excelling in math and code.
- MiMo-7B-RL-Zero: RL-trained directly from the base model, achieving 93.6% on specific benchmarks.

Performance
Despite its compact size (7B parameters), MiMo outperforms larger models such as OpenAI’s closed-source o1-mini and Alibaba’s Qwen-32B-Preview on key benchmarks (e.g., AIME24, AIME25, LiveCodeBench, MATH500, GPQA-Diamond). For example, MiMo-7B-RL matches o1-mini’s performance on math and coding tasks. Training uses a three-stage pre-training data mixture and RL on 130K curated math/code problems verified by rule-based systems to ensure quality. A test-difficulty-driven reward system and data resampling further improve optimization.

Availability
MiMo is open-source, with models available on Hugging Face (https://huggingface.co/XiaomiMiMo). Xiaomi supports inference via a forked version of vLLM, though compatibility with other engines is unverified; a minimal inference sketch follows this post. The team welcomes contributions and feedback at mimo@xiaomi.com. The release includes checkpoints for all model variants, aiming to benefit the broader AI community by providing insights into building reasoning-focused LLMs.

Significance
MiMo marks Xiaomi’s entry into the competitive AI landscape, showcasing its ambition beyond hardware. Posts on X highlight its compact efficiency and strong benchmark results, praising Xiaomi’s push into open-source AI. Unlike proprietary models, MiMo’s open-source nature lets developers and researchers adapt and build on it, potentially accelerating advances in reasoning-focused AI applications.
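A minimal inference sketch for the published checkpoints, using generic Hugging Face transformers. The repo id "XiaomiMiMo/MiMo-7B-RL", the prompt, and the decoding settings are assumptions for illustration (only the Hugging Face org and the model names come from the post); the post says Xiaomi's verified path is its forked vLLM, so treat this generic route as unconfirmed and check the model card for the exact id and any custom code.

```python
# Hedged sketch, not from the post: load a MiMo checkpoint with stock
# Hugging Face transformers. The repo id below is an assumption based on
# the model names and the Hugging Face org linked above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "XiaomiMiMo/MiMo-7B-RL"  # assumed; confirm on huggingface.co/XiaomiMiMo

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # a 7B model fits on one modern GPU in bf16
    device_map="auto",
    trust_remote_code=True,       # in case the repo ships custom model code
)

# Reasoning-style prompt; decoding settings are illustrative defaults.
messages = [{"role": "user",
             "content": "Solve step by step: what is the sum of the first 50 positive integers?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512, do_sample=True, temperature=0.6)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

For serving, the same prompt-and-decode flow maps onto a vLLM setup, but since the post only verifies Xiaomi's fork of vLLM, other engines should be treated as experimental.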
Johnny Climber
我爱Mikasa
:
I will sell some to free up money but still keep a base position. I recently sold quite a bit of Xiaomi at around 48.3 to have money to buy the bubbles.
$XIAOMI-W (01810.HK)$Xiaomi’s Xring Is The Name Of The Company’s Upcoming In-House Chipset And It Finally Has A Potential Launch Date, Though Tipster Hints That It Could Be Delayed
Moriarty mcG
OP
:
Xiaomi’s In-House Chipset’s Specification Details Come Through; Rumor Claims That SoC Will Be Mass Produced On TSMC’s 4nm ‘N4P’ But Will Stick To ARM’s Current CPU Designs
Moriarty mcG
OP
:
The XRING SoC positions Xiaomi to compete more aggressively in the premium smartphone market, with potential to reshape its role in the semiconductor industry if successful. Official details are still pending, so some specs remain speculative.
$XIAOMI-W (01810.HK)$Forget the "China car" stereotype: Xiaomi will attack "Porsche" from Munich. What is Xiaomi planning? A new report really kicks things into high gear. Xiaomi plans to enter the European market with its new car division in the medium term – CEO Lei Jun has already publicly announced this. The first model, the SU7, has also attracted considerable attention here and is enjoying strong sales in China. But when will sales begin here and in the EU? Now, exclusive research...
$XIAOMI-W (01810.HK)$What could Xiaomi be worth? If all goes to plan, what price could we be looking at? I'm thinking 65 HKD to 70-ish.
Frank0707
:
There are no long-term issues, but the chance of a straight surge this time is almost nonexistent. It depends on whether you focus on the long term or the short term.
Moriarty mcG
OP
:
If they get this 3nm chip into production, then who knows. The chip is the most important component, and building it in-house would be a fantastic accomplishment for Xiaomi. If anyone can do it, they can.
$XIAOMI-W (01810.HK)$Xiaomi’s shares surged another 6% in Hong Kong trade on Wednesday after Jefferies reiterated its Buy rating on the technology giant, while also touting it as a top pick among its Chinese peers. The shares jumped 5.8% on Tuesday.