One of Europe’s most prominent AI startups has released two AI models that are so tiny, they have named them after a chicken’s brain and a fly’s brain.
Multiverse Computing claims these are the world’s smallest models that are still high performing and can handle chat, speech, and even reasoning in one case.
These new tiny models are intended to be embedded into internet of things devices, as well as run locally on smartphones, tablets, and PCs.
“We can compress the model so much that they can fit on devices,” Multiverse co-founder Román Orús told TechCrunch. “You can run them on premises, directly on your iPhone or on your Apple Watch.”
As we previously reported, Multiverse Computing is a buzzy European AI startup headquartered in Donostia, Spain, with about 100 employees in offices worldwide. It was co-founded by Román Orús, a leading European professor of quantum computing and physics; quantum computing expert Samuel Mugel; and Enrique Lizaso Olmos, the former deputy CEO of Unnim Banc.
It just raised €189 million (about $215 million) in June on the strength of a model compression technology it calls “CompactifAI.” (Since it was founded in 2019, it has raised about $250 million, Orús said.)
CompactifAI is a quantum-inspired compression algorithm that reduces the size of existing AI models without sacrificing those models’ performance, Orús said.
“We have a compression technology that is not the typical compression technology that the people from computer science or machine learning will do, because we come from quantum physics,” he said. “It’s a more subtle and more refined compression algorithm.”
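Multiverse hasn’t published CompactifAI’s internals, so as a loose illustration only, here is a classical stand-in for the general idea of shrinking a model’s weight matrices: truncated SVD, which replaces a large matrix with two thin factors holding far fewer parameters. The matrix sizes and rank below are arbitrary examples, not anything from Multiverse.

```python
# Illustrative sketch only: NOT CompactifAI, which Multiverse describes as a
# quantum-inspired (tensor-network-style) method. Truncated SVD just shows the
# basic trade-off of approximating a weight matrix with fewer parameters.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((512, 512))  # stand-in "weight matrix"

rank = 64  # keep only the top-64 singular components
U, s, Vt = np.linalg.svd(W, full_matrices=False)
W_approx = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # low-rank approximation

original_params = W.size                          # 512 * 512 = 262,144
compressed_params = rank * (W.shape[0] + W.shape[1])  # 64 * 1024 = 65,536
print(f"{original_params} -> {compressed_params} parameters "
      f"({original_params / compressed_params:.0f}x smaller)")
```

The same tension the company describes shows up here: a lower rank means a smaller model but a coarser approximation, and the craft lies in compressing aggressively without losing accuracy.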
The company has already released a long list of compressed versions of open-source models, especially popular small models like Llama 4 Scout and Mistral Small 3.1. And it just launched compressed versions of OpenAI’s two new open models. It has also compressed some very large models; it offers a DeepSeek R1 Slim, for instance.
But since it’s in the business of making models smaller, it has focused extra attention on making the smallest yet most powerful models possible.
Its two new models are so small that they can bring chat AI capabilities to just about any IoT device and work without an internet connection, the company says. It humorously calls this family the Model Zoo because it’s naming the products based on animal brain sizes.
A model it calls SuperFly is a compressed version of Hugging Face’s open-source model SmolLM2-135M. The original has 135 million parameters and was developed for on-device use. SuperFly has 94 million parameters, which Orús likens to the size of a fly’s brain. “This is like having a fly, but a little bit more clever,” he said.
SuperFly is designed to be trained on very restricted data, like a device’s operations. Multiverse envisions it embedded into home appliances, allowing users to operate them with voice commands like “start quick wash” for a washing machine. Or users can ask troubleshooting questions. With a little processing power (like an Arduino), the model can handle a voice interface, as the company showed in a live demo to TechCrunch.
The other model, ChickBrain, is far larger at 3.2 billion parameters, but it’s also far more capable, with reasoning abilities. It’s a compressed version of Meta’s Llama 3.1 8B model, Multiverse says, yet it’s small enough to run on a MacBook with no internet connection required.
More importantly, Orús said that ChickBrain actually slightly outperforms the original on several standard benchmarks, including the language-skill benchmark MMLU-Pro, the math benchmarks MATH-500 and GSM8K, and the general knowledge benchmark GPQA Diamond.
Multiverse shared results from its internal tests of ChickBrain on those benchmarks. The company didn’t offer benchmark results for SuperFly, but it also isn’t targeting SuperFly at use cases that require reasoning.

It’s important to note that Multiverse isn’t claiming its Model Zoo will beat the largest state-of-the-art models on such benchmarks; its scores might not even land on the leaderboards. The point, the company says, is that its tech can shrink a model’s size without a performance hit.
Orús says the company is already in talks with all the leading device and appliance makers. “We are talking with Apple. We are talking with Samsung, also with Sony and with HP, obviously. HP came as an investor in the last round,” he said. The round was led by well-known European VC firm Bullhound Capital, with participation from many others, including HP Tech Ventures and Toshiba.
The startup also offers compression tech for other forms of machine learning, like image recognition, and in six years has landed clients including BASF, Ally, Moody’s, and Bosch.
In addition to selling its models directly to major device manufacturers, Multiverse offers its compressed models via an API hosted on AWS that any developer can use, often at lower token fees than competitors.