SETI, but for LLMs: how a solution barely a few months old could revolutionize the way inference is done


  • Exo supports Llama, Mistral, LLaVA, Qwen and DeepSeek
  • Runs on Linux, macOS, Android and iOS, but not Windows
  • AI models that need 16 GB of RAM can run on two 8 GB laptops

Running large language models (LLMs) typically requires expensive, high-performance hardware with significant memory and GPU power. However, the exo software now offers an alternative by enabling distributed artificial intelligence (AI) inference across a network of devices.

The software allows users to combine the computing power of multiple computers, smartphones and even single-board computers (SBCs) such as the Raspberry Pi to run models that would otherwise be out of reach.
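The way such a split can work is by dividing a model's layers across devices in proportion to each device's available memory, so a 16 GB model fits on two 8 GB machines. The sketch below illustrates that idea in Python; the function name and the simple proportional scheme are assumptions for illustration, not exo's actual code:

```python
def partition_layers(total_layers, device_memories_gb):
    """Assign contiguous layer ranges to devices, proportional to each
    device's available memory (illustrative sketch, not exo's code)."""
    total_mem = sum(device_memories_gb)
    ranges = []
    start = 0
    for i, mem in enumerate(device_memories_gb):
        if i == len(device_memories_gb) - 1:
            end = total_layers  # last device takes any rounding remainder
        else:
            end = start + round(total_layers * mem / total_mem)
        ranges.append((start, end))
        start = end
    return ranges

# Two 8 GB laptops splitting a 32-layer model: each handles half the layers.
print(partition_layers(32, [8, 8]))  # [(0, 16), (16, 32)]
```

During inference, each device would then run only its own layer range and stream activations to the next device in the ring, which is why the combined memory of the group is what matters.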
