Mirai lets you add high-performance AI directly to your app with no network latency, full data privacy, and no cloud inference costs. You no longer need an ML team or weeks of setup: a single developer can get everything running in minutes. To achieve this, we offer the following key products.

Documentation Index
Fetch the complete documentation index at: https://docs.trymirai.com/llms.txt
Use this file to discover all available pages before exploring further.
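As a sketch of how a tool might consume this index, the helper below downloads llms.txt and pulls out the page URLs it mentions. The function names and the link-extraction logic are illustrative assumptions, not part of Mirai's documented API.

```python
import urllib.request

INDEX_URL = "https://docs.trymirai.com/llms.txt"


def fetch_index(url: str = INDEX_URL) -> str:
    """Download the raw llms.txt documentation index as text."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")


def extract_links(index_text: str) -> list[str]:
    """Collect every URL mentioned in the index text, in order.

    Assumes links appear as plain http(s) tokens, which is how
    llms.txt files conventionally list their pages.
    """
    links = []
    for token in index_text.split():
        if token.startswith(("http://", "https://")):
            links.append(token.rstrip(").,"))
    return links
```

You would typically call `extract_links(fetch_index())` once and cache the result before exploring individual pages.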

uzu
A Rust inference engine built to run AI models efficiently, with hardware-specific optimizations in mind.

lalamo
A set of tools to optimize and convert models for on-device use.

cli
A command-line tool to chat with models and serve them as a local API.
FAQ
Is this project open source?
Which devices are supported?
Currently, only Apple Silicon (iOS/macOS) devices are supported.
Which models are supported?
The full list of supported models is available here.