Looking for an instant start? Check out the drop-in snippets. Otherwise, follow the detailed integration guide.

uzu
A Rust inference engine built to run AI models with the target hardware's specifics in mind.

uzu-swift
Prebuilt bindings to run uzu from Swift.

uzu-ts
Prebuilt bindings to run uzu from TypeScript.

lalamo
A set of tools to optimize and convert models for on-device use.

CLI
A command-line tool to chat with models and serve them as a local API.

Platform
A console where you can create your organization and grab an API key for SDK use.
Where to start?
I want to add AI to my app
If you just need a drop-in solution to run one of the supported models, you’ll need a binding framework that matches your application stack. Check our guide for step-by-step integration and an overview of the key concepts.
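The general shape of that flow is sketched below. This is an illustrative placeholder only, assuming a Swift stack: the names used here (InferenceEngine, GeneratedText, loadModel, generate, the model id) are hypothetical stand-ins, not the actual uzu-swift API. Use the drop-in snippets and the integration guide for the real calls.

```swift
// Hypothetical sketch of a drop-in integration flow.
// All types and method names below are placeholders, NOT the uzu-swift API.
import Foundation

// Stand-in result type for a generation call.
struct GeneratedText {
    let text: String
}

// Stand-in for whatever interface the binding framework exposes.
protocol InferenceEngine {
    func loadModel(named name: String) async throws
    func generate(prompt: String) async throws -> GeneratedText
}

// Typical steps a binding covers:
// 1. initialize the engine (with the API key from the Platform console),
// 2. load one of the supported models,
// 3. run generation from your app code.
func runExample(engine: InferenceEngine) async throws {
    try await engine.loadModel(named: "some-supported-model") // hypothetical model id
    let reply = try await engine.generate(prompt: "Hello!")
    print(reply.text)
}
```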
I want to add support for a new model
Check out the overview of the lalamo models toolkit.
I want to add an advanced feature to the inference engine
Check out the overview of the uzu inference engine.
FAQ
Is this project open source?
Is it free to use?
Which devices are supported?
Currently, we support only Apple devices (iOS/macOS).
Which models are supported?
The full list of supported models is available here.