This guide covers only the simplest steps to run an LLM on your device. For a detailed, step-by-step explanation of how everything works, see the full integration guide.
1. Create a new SwiftUI project

2. Add the SDK

Add this package via Swift Package Manager (SPM):
https://github.com/trymirai/uzu-swift.git
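If you declare dependencies in a Package.swift rather than through Xcode's package UI, the entry looks roughly like the sketch below. The product name "Uzu" (matching the import used later in this guide), the version, and the platform versions are assumptions; check the repository for the actual values.

// swift-tools-version:5.9
// Package.swift sketch; in an Xcode app project you would instead use
// File > Add Package Dependencies... and paste the repository URL.
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v17), .macOS(.v14)], // adjust to the SDK's actual minimums
    dependencies: [
        // The version below is a placeholder; pin to the release you actually use.
        .package(url: "https://github.com/trymirai/uzu-swift.git", from: "0.1.0"),
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [.product(name: "Uzu", package: "uzu-swift")]
        ),
    ]
)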
3. Get an API key

Go to the Platform and follow the guide there to get an API key.
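The snippet in the next step hardcodes the key as a string, which is fine for a quick test. If you prefer not to keep it in source, you could read it from configuration instead; the Info.plist key name below (MIRAI_API_KEY) is purely a hypothetical example, not something the SDK defines.

import Foundation

// Hypothetical helper: looks up the API key in Info.plist under an
// arbitrary key name so it is not hardcoded in source.
func loadAPIKey() -> String? {
    Bundle.main.object(forInfoDictionaryKey: "MIRAI_API_KEY") as? String
}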
4. Paste the snippet

Go to ContentView.swift and add this snippet:
Don’t forget to replace API_KEY with your own key.
import Uzu

public func runQuickStart() async throws {
    // Create the engine with the API key from the Platform.
    let engine = try await UzuEngine.create(apiKey: "API_KEY")

    // Resolve the model and download its weights, reporting progress as it goes.
    let model = try await engine.chatModel(repoId: "Qwen/Qwen3-0.6B")
    try await engine.downloadChatModel(model) { update in
        print("Progress: \(update.progress)")
    }

    // Open a chat session and run a single prompt.
    // The trailing closure is invoked during generation; returning true lets it continue.
    let session = try engine.chatSession(model)
    let output = try session.run(
        input: .text(text: "Tell me a short, funny story about a robot"),
        config: RunConfig()
    ) { _ in
        return true
    }

    // Print the generated text.
    print(output.text.original)
}
5. Add the snippet call

var body: some View {
    VStack {
        Text("On-device AI")
    }
    .onAppear {
        // Kick off the quick start once the view appears.
        Task {
            try await runQuickStart()
        }
    }
}
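Because runQuickStart is throwing, a Task that simply calls it with try will discard any error it throws. A minimal variation that surfaces failures during development:

.onAppear {
    Task {
        do {
            try await runQuickStart()
        } catch {
            // e.g. invalid API key, download failure, or generation error
            print("Quick start failed: \(error)")
        }
    }
}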
Now that we’ve tried the simplest snippet, let’s take a look at the step-by-step integration guide.