This guide covers only the simplest steps to run an LLM on your device. For a detailed, step-by-step explanation of how everything works, see the full integration guide.
1. Create a new SwiftUI project

2. Add the SDK

Add this package through SPM:
https://github.com/trymirai/uzu.git
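If you manage dependencies in a Package.swift manifest instead of through Xcode's UI, the declaration looks roughly like this. The version requirement, platform minimums, and the product name "Uzu" are assumptions; check the repository for the actual release and product name:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v17), .macOS(.v14)], // assumed minimum platforms
    dependencies: [
        // Version is illustrative; pin the latest release from the repository
        .package(url: "https://github.com/trymirai/uzu.git", from: "0.1.0"),
    ],
    targets: [
        .executableTarget(
            name: "MyApp",
            dependencies: [.product(name: "Uzu", package: "uzu")]
        )
    ]
)
```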
3. Get an API key

Go to the Platform and follow this guide to generate an API key.
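Hardcoding the key is fine for a quick test, but a common pattern is to keep it out of source control, for example in a user-defined Info.plist entry. This is not part of the SDK; the entry name `MIRAI_API_KEY` below is purely illustrative:

```swift
import Foundation

// Reads the API key from the app's Info.plist.
// "MIRAI_API_KEY" is a hypothetical entry name you define yourself.
func apiKey() -> String {
    guard
        let key = Bundle.main.object(forInfoDictionaryKey: "MIRAI_API_KEY") as? String,
        !key.isEmpty
    else {
        fatalError("Add MIRAI_API_KEY to your target's Info.plist")
    }
    return key
}
```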
4. Paste the snippet

Go to ContentView.swift and add this snippet. Don't forget to replace "API_KEY" with your actual API key.
import Uzu

public func runExample() async throws {
    // Activate the engine with your API key
    let engine = UzuEngine()
    let status = try await engine.activate(apiKey: "API_KEY")
    guard status == .activated || status == .gracePeriodActive else {
        return
    }

    // Refresh the registry of available models
    try await engine.updateRegistry()
    let localModelId = "Meta-Llama-3.2-1B-Instruct-float16"

    // Download the model if it is not already on the device
    let modelDownloadState = engine.downloadState(identifier: localModelId)
    if modelDownloadState?.phase != .downloaded {
        let handle = try engine.downloadHandle(identifier: localModelId)
        try handle.start()
        let progressStream = try handle.progress()
        while let downloadProgress = await progressStream.next() {
            print("Progress: \(downloadProgress.progress)")
        }
    }

    // Create an inference session and load the model into memory
    let modelId: ModelId = .local(id: localModelId)
    let session = try engine.createSession(modelId)
    try session.load(
        preset: .general,
        samplingSeed: .default,
        contextLength: .default
    )

    // Build the chat-style input
    let messages = [
        SessionMessage(role: .system, content: "You are a helpful assistant."),
        SessionMessage(role: .user, content: "Tell me a short, funny story about a robot."),
    ]
    let input: SessionInput = .messages(messages: messages)

    // Run generation; returning true from the callback continues generation
    let output = try session.run(
        input: input,
        tokensLimit: 256,
        samplingConfig: nil
    ) { _ in
        return true
    }
    print("Output: \(output)")
}
5. Add the snippet call

var body: some View {
    VStack {
        Text("On-device AI")
    }
    .onAppear {
        Task {
            do {
                try await runExample()
            } catch {
                print("Example failed: \(error)")
            }
        }
    }
}
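Since runExample() is async, SwiftUI's built-in .task modifier is a slightly more idiomatic alternative to .onAppear plus a manual Task: it starts the work when the view appears and cancels it automatically if the view disappears. A minimal sketch:

```swift
var body: some View {
    VStack {
        Text("On-device AI")
    }
    .task {
        do {
            try await runExample()
        } catch {
            print("Example failed: \(error)")
        }
    }
}
```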
Now that we’ve tried the simplest snippet, let’s take a look at the step-by-step integration guide.