Xiaomi has quietly stepped into the large language model space with MiMo-7B, its first openly released model. Built by the newly assembled Big Model Core Team, MiMo-7B focuses ...
Obsidian is already great, but my local LLM makes it better ...
Users running a quantized 7B model on a laptop expect 40+ tokens per second. A 30B MoE model on a high-end mobile device ...
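Numbers like these are easy to sanity-check on your own hardware. Below is a rough sketch in R that times a single reply from a local model served by Ollama via the ellmer package; the model tag and the four-characters-per-token estimate are assumptions, and local runtimes will usually report exact token counts themselves.

```r
# Rough tokens-per-second check against a local Ollama server.
# Assumes ellmer is installed and `ollama serve` is running;
# the model tag below is only an example.
library(ellmer)

chat <- chat_ollama(model = "qwen2.5:7b-instruct-q4_K_M")

prompt <- "Summarise the trade-offs of 4-bit quantization in three sentences."

# Wall-clock time for one full reply (includes prompt processing, not just generation).
elapsed <- system.time(
  reply <- chat$chat(prompt)
)[["elapsed"]]

# Crude estimate: roughly 4 characters per token for English text.
approx_tokens <- nchar(as.character(reply)) / 4
cat(sprintf("~%.0f tokens in %.1f s (~%.1f tok/s)\n",
            approx_tokens, elapsed, approx_tokens / elapsed))
```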
Use the vitals package with ellmer to evaluate and compare the accuracy of LLMs, including writing evals to test local models.
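For a concrete starting point, here is a minimal sketch following the pattern in the vitals README: a tiny inline dataset with `input` and `target` columns, a local Ollama model as the solver via ellmer, and a model-graded scorer. The model tag and the example questions are placeholders rather than anything from the original post.

```r
# Minimal vitals eval of a local model served by Ollama.
# Assumes vitals, ellmer, and tibble are installed and Ollama is running;
# the model tag and the two-question dataset are placeholders.
library(vitals)
library(ellmer)
library(tibble)

dataset <- tibble(
  input  = c("What does the dplyr verb `filter()` do?",
             "Which base R function reads a CSV file into a data frame?"),
  target = c("It keeps only the rows of a data frame that match a condition.",
             "read.csv()")
)

tsk <- Task$new(
  dataset = dataset,
  solver  = generate(chat_ollama(model = "llama3.2")),  # local model under test
  scorer  = model_graded_qa()                           # an LLM grades each answer against `target`
)

tsk$eval()
```

Swapping `chat_ollama()` for another ellmer chat constructor runs the same eval against a hosted model, which makes side-by-side accuracy comparisons straightforward.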