And it’s open source, too.
{ TinyZero is a small-scale reproduction, with the $30 price going toward server costs to run the experiments. TinyZero is “only useful for very restricted types of tasks” such as countdown and multiplication tasks, he said. }
At a cost of billions to train Llama, Grok, or ChatGPT, only the largest companies could afford such AI.
Then DeepSeek did it for $6M, which opens the door to a lot of entities.
Now, UC Berkeley reveals an LLM that costs $30.
Now your neighbor’s 13-year-old “Bill Gates wannabe” can “experiment” with AI.
These large language models are rapidly becoming commoditized.
As promised, the price of using AI is dropping such that “anyone can afford it”.

ralph
How much intelligence does an ant have? Ants increase their intelligence through their large numbers. Intelligence is an emergent property of complex systems. Size matters. The size of the network matters. Stuart Kauffman showed that the more nodes a network has, the more chaotic it becomes. It is out of this chaos that emergent properties arise. Division of labor can increase the efficiency or performance of the system, but it does not replace the effect of size.
This is not to underrate UC Berkeley’s work. DeepSeek opened the floodgates of AI technology. Imagine that instead of GigaGiant GPU Clusters [GGGC], all Tesla EVs acted together like a GGGC. Elon has already proposed such an idea, though not specifically for AI.
- Ant colonies
- Bee hives
- Bird flocks
- Fish schools
- Buffalo herds
- Human congregations
- Size matters
The Captain