According to an official announcement on March 16, the Bittensor subnet Templar (SN3) completed Covenant-72B, the largest decentralized LLM pre-training run to date, on March 10.
Covenant-72B is a 72-billion-parameter language model pre-trained by the Templar team on Bittensor subnet 3, coordinated entirely over the public internet with no centralized data center. The model scored 67.1 on MMLU (zero-shot), outperforming centralized baselines such as LLaMA-2-70B and LLM360 K2 under the same evaluation conditions. It is the largest fully permissionless, collaboratively trained language model to date, with more than 70 distinct nodes contributing compute throughout the run. The team has released all weights and checkpoints under the Apache License.
Possibly buoyed by the news, Bittensor (TAO) and its subnet tokens broadly rallied: TAO is up 54.8% over the past two weeks, and the subnet token τemplar has risen 194% over the past seven days, currently trading at $19.3.