Xiaomi MiLM Tested Offline – On-Device AI
In a significant stride towards advancing its AI capabilities, Xiaomi has unveiled its MiLM-6B language model. The release marks the model's first appearance on both C-Eval and CMMLU, two major benchmarks for evaluating Chinese-language AI models, underscoring Xiaomi's commitment to innovation in the AI space.
MiLM-6B is a large-scale pre-trained language model developed by Xiaomi with 6.4 billion parameters, paving the way for more nuanced and contextually aware interactions between humans and machines. According to the information shared on the project's GitHub page, the model's use is not limited to cloud-based deployment.
Today, Xiaomi showcased MiLM’s capabilities through an offline demo test video. The video featured a crucial question: “Does AI still work away from network signals?” Xiaomi provided an emphatic answer by demonstrating the model’s offline capabilities.
The video depicted Xiaomi MiLM running on a mobile device, using the local NPU for compute and accessing local data to generate text with notable speed and efficiency. This approach enables high-speed text generation with minimal power consumption, without a network connection.
Key test results from the offline demo include:
- Text generation speed: 11.2 words per second
- CPU usage: 11.0%
- Memory usage: 2,798 MB
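Xiaomi has not published how these figures were collected, but a words-per-second measurement of this kind can be sketched with a simple timing harness. The sketch below is a hypothetical illustration: `dummy_model` is a stand-in for whatever on-device generation API the phone exposes, and the function names are assumptions, not Xiaomi's actual tooling.

```python
import time

def benchmark_generation(generate_fn, prompt, runs=3):
    """Average words-per-second over several runs of a text-generation
    callable. generate_fn is any function taking a prompt string and
    returning generated text; here it stands in for an on-device model."""
    speeds = []
    for _ in range(runs):
        start = time.perf_counter()
        text = generate_fn(prompt)
        elapsed = time.perf_counter() - start
        speeds.append(len(text.split()) / elapsed)
    return sum(speeds) / len(speeds)

# Hypothetical stand-in "model": a fixed reply after a small delay.
def dummy_model(prompt):
    time.sleep(0.01)
    return "on device generation keeps user data local " * 4

avg_wps = benchmark_generation(dummy_model, "Does AI work offline?")
print(f"average speed: {avg_wps:.1f} words/sec")
```

A real measurement would also sample CPU and memory usage from the OS while the model runs, which is presumably how the 11.0% CPU and 2,798 MB figures above were obtained.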
In some instances, the on-device model even rivals the performance of its cloud-based counterparts. Running Xiaomi MiLM offline also offers the potential for enhanced privacy and data security, since user data never has to leave the device.
Xiaomi frames user privacy and data security as central to the effort: by processing requests with an on-device model rather than in the cloud, the company aims to safeguard user information. In that sense, the accomplishment is not just a technological milestone but a practical answer to data-security concerns in the AI era.