Tiiny AI unveils a pocket-sized Mini PC that runs 120B LLMs locally
Tiiny AI, a US-based startup, has announced the Tiiny AI Pocket Lab, which it claims is the world's smallest personal AI supercomputer. The device weighs just 300 grams, fits in one hand, and delivers up to 190 TOPS of AI compute performance. It has been officially certified by Guinness World Records as the "Smallest MiniPC (100B LLM Locally)" and can run up to 120-billion-parameter large language models (LLMs) entirely on-device.

The Pocket Lab features a 12-core ARMv9.2 CPU paired with a custom-designed NPU, delivering approximately 190 TOPS of AI compute. It includes 80GB of LPDDR5X memory and a 1TB SSD to support large-scale inference workloads. Tiiny AI has built the device to operate within a 30W TDP and a typical 65W system power envelope, a design the company says delivers high performance at low energy consumption.
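To put the 80GB of memory in context, here is a rough back-of-envelope sketch (our estimate, not a figure from Tiiny AI) of how much storage a 120B-parameter model's weights need at different quantization levels. It suggests such a model fits on-device only when quantized to around 4 bits per weight, leaving headroom for the KV cache and runtime:

```python
# Rough estimate of LLM weight memory at different quantization levels.
# Figures are approximate and ignore KV cache, activations, and runtime overhead.

def weight_memory_gb(num_params: float, bits_per_weight: float) -> float:
    """Approximate weight storage in gigabytes."""
    bytes_total = num_params * bits_per_weight / 8
    return bytes_total / 1e9

for bits in (16, 8, 4):
    gb = weight_memory_gb(120e9, bits)
    print(f"120B params @ {bits}-bit: ~{gb:.0f} GB")

# Approximate output:
# 120B params @ 16-bit: ~240 GB
# 120B params @ 8-bit:  ~120 GB
# 120B params @ 4-bit:  ~60 GB  <- within the Pocket Lab's 80GB of LPDDR5X
```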
Tiiny AI says the Pocket Lab addresses the growing concerns around cloud dependency, data privacy, and rising energy costs. The company believes local AI processing offers a more sustainable and secure alternative to cloud-based systems. The device stores all user data and model interactions locally with bank-level encryption, eliminating the need for internet access or external servers.

The Pocket Lab supports one-click deployment of several open-source LLMs and agent frameworks. These include GPT-OSS, Llama, Mistral, DeepSeek, Qwen, and Phi, along with automation tools like ComfyUI, SillyTavern, and Flowise. The system runs entirely offline and enables use cases like multi-step reasoning, content generation, deep context understanding, and secure personal memory.
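Tiiny AI has not published details of its deployment interface, but local LLM runtimes of this kind commonly expose an OpenAI-compatible HTTP endpoint on the device itself. A minimal sketch of what querying such a local server could look like, assuming an Ollama-style endpoint at http://localhost:11434/v1 (the endpoint and model name are illustrative, not confirmed Pocket Lab specifics):

```python
# Minimal sketch of querying a locally hosted model through an
# OpenAI-compatible API. The endpoint and model tag are assumptions;
# they are not confirmed details of the Tiiny AI Pocket Lab.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # e.g. Ollama's default local endpoint
    api_key="not-needed",                  # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="qwen2.5:14b",  # illustrative open-source model tag
    messages=[{"role": "user", "content": "Summarize this document in three bullet points."}],
)
print(response.choices[0].message.content)
```

Because everything runs against a local endpoint, the prompt and the response never leave the device, which is the core of the privacy argument Tiiny AI is making.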
The company plans to showcase the Tiiny AI Pocket Lab at CES 2026. It has positioned the product for developers, researchers, professionals, and students who require portable and private AI capabilities. The device aims to serve real-world use cases in the 10B to 100B parameter range, which the company says accounts for over 80% of practical AI demands.
In related news, Xiaomi is reportedly developing a new AI assistant called Mi Chat. Separately, reports suggest that GPUs are no longer the main barrier to AI progress, with other system-level constraints now becoming the limiting factor.