
Programming Central

Posted on • Originally published at programmingcentral.hashnode.dev

Swift 6 Concurrency: How to Build Rock-Solid AI Apps with Sendable and Actors

Modern AI applications are a concurrency nightmare. Think about it: you have an LLM performing heavy inference on a background thread, streaming tokens in real-time, while your SwiftUI interface needs to stay buttery smooth. One wrong move and you’re staring at a "data race"—those insidious, hard-to-debug crashes that happen when two threads fight over the same piece of data.

With the release of Swift 6, Apple has fundamentally changed the game. By moving concurrency safety from a runtime "hope for the best" approach to a compile-time guarantee, Swift 6 lets us build reactive, intelligent interfaces that the compiler verifies are free of data races.

In this post, we’ll dive into the four pillars of modern Swift concurrency—Sendable, actors, async/await, and @Observable—and see how they work together to power the next generation of AI apps.

The Problem: Why AI Apps Break

AI apps aren't static. When you’re streaming a response from a model like GPT-4 or Llama 3, your data model is constantly mutating. If your UI thread reads the messages array while a background task is appending a new token, you have a data race: undefined behavior that can crash the app or silently corrupt state.

Before Swift 6, we relied on manual locks or dispatch queues. Today, we have a more elegant framework.

1. Sendable: The Gatekeeper of Data Safety

At the heart of this revolution is the Sendable protocol. It’s a marker that tells the compiler: "This data is safe to pass across threads."

  • Value Types (Structs/Enums): These are the heroes of Swift 6. Because they are copied when passed around, they are implicitly Sendable as long as all their stored properties are Sendable.
  • Reference Types (Classes): These are dangerous. To make a class Sendable, it must be immutable or internally synchronized.

In an AI context, Sendable ensures that the String tokens coming off your model can safely travel from the background inference engine to your UI data model without causing a collision.
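A minimal sketch of what the compiler accepts and rejects under strict concurrency (all type names here are illustrative):

```swift
// Implicitly Sendable: a value type composed entirely of Sendable members.
struct Token: Sendable {
    let text: String
    let index: Int
}

// NOT Sendable: a class with mutable state. Under Swift 6 strict
// concurrency, passing an instance of this across an actor boundary
// is a compile-time error.
final class MutableBuffer {
    var tokens: [String] = []
}

// A safe pattern for a class: make it fully immutable, then it can
// conform to Sendable.
final class FrozenConfig: Sendable {
    let modelName: String
    let temperature: Double

    init(modelName: String, temperature: Double) {
        self.modelName = modelName
        self.temperature = temperature
    }
}
```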

2. Actors: Protecting the Source of Truth

While Sendable handles the data in transit, actors handle the data at rest.

An actor is a reference type that isolates its state. It ensures that only one task can access its properties at a time. If you have a ChatConversation actor, and five different background tasks try to append tokens simultaneously, the actor serializes them. No data races, no corruption.
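The ChatConversation scenario above can be sketched as follows (simplified for illustration):

```swift
// A minimal actor: all access to `messages` is serialized automatically.
actor ChatConversation {
    private var messages: [String] = []

    func append(_ token: String) {
        messages.append(token)  // Only one task can be in here at a time.
    }

    func transcript() -> [String] {
        messages
    }
}

// Callers hop into the actor with `await`. Five concurrent tasks
// appending tokens are serialized, never interleaved mid-mutation.
func demo(conversation: ChatConversation) async {
    await withTaskGroup(of: Void.self) { group in
        for i in 0..<5 {
            group.addTask { await conversation.append("token-\(i)") }
        }
    }
}
```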

3. @Observable: Fine-Grained UI Updates

Swift 5.9 introduced the @Observable macro, replacing the older ObservableObject protocol. For AI apps, this is a performance lifesaver.

Instead of re-rendering your entire chat screen every time a new token arrives, @Observable allows SwiftUI to track exactly which property changed. If only the last message was updated, only that specific part of the UI refreshes. This is the secret to maintaining 60 FPS while an AI is "typing" at high speeds.
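Here is a small sketch of that property-level tracking (the type and view names are hypothetical):

```swift
import SwiftUI

// SwiftUI tracks which properties each view body actually reads,
// so a view that only reads `messages` is untouched when
// `isGenerating` flips.
@Observable
final class ChatState {
    var messages: [String] = []
    var isGenerating = false
}

struct ChatView: View {
    let state: ChatState

    var body: some View {
        // This body re-renders only when `messages` changes;
        // toggling `isGenerating` elsewhere does not invalidate it.
        List(state.messages, id: \.self) { message in
            Text(message)
        }
    }
}
```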


Putting It All Together: The Chat Data Model

Here is how you design a thread-safe chat model in Swift 6. This architecture uses an actor to manage the "source of truth" and an @Observable class to bridge that data to the SwiftUI main thread.

import Foundation
import SwiftUI

// 1. Define Sendable data structures
public enum MessageRole: Sendable {
    case user, assistant
}

public struct ChatMessage: Sendable, Identifiable {
    public let id: UUID = UUID()
    public let role: MessageRole
    public var content: String
}

// 2. Create an Observable state for the UI
@MainActor @Observable
public class ChatUIState {
    public var messages: [ChatMessage] = []
    public var isGenerating: Bool = false
}

// 3. Use an Actor to manage background mutations safely
public actor ChatManager {
    private let uiState: ChatUIState
    private var internalMessages: [ChatMessage] = []

    public init(uiState: ChatUIState) {
        self.uiState = uiState
    }

    public func streamToken(_ token: String, to messageID: UUID) async {
        // Update internal state
        if let index = internalMessages.firstIndex(where: { $0.id == messageID }) {
            internalMessages[index].content += token

            // Sync to UI on the MainActor
            let updatedMessage = internalMessages[index]
            await MainActor.run {
                if let uiIndex = uiState.messages.firstIndex(where: { $0.id == messageID }) {
                    uiState.messages[uiIndex] = updatedMessage
                }
            }
        }
    }

    public func addNewMessage(role: MessageRole, content: String) async {
        let newMessage = ChatMessage(role: role, content: content)
        internalMessages.append(newMessage)

        await MainActor.run {
            uiState.messages.append(newMessage)
        }
    }
}
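Here is one way this could be wired up from the UI side, sketched under the assumption that your inference engine exposes its tokens as an AsyncStream&lt;String&gt; (the function name is illustrative):

```swift
// Hypothetical driver: the UI owns the state, a Task drives the actor.
@MainActor
func runAssistantTurn(tokenStream: AsyncStream<String>) {
    let uiState = ChatUIState()
    let manager = ChatManager(uiState: uiState)

    Task {
        // Create an empty assistant message, then stream tokens into it.
        await manager.addNewMessage(role: .assistant, content: "")

        // We are back on the MainActor here, so reading uiState is safe.
        guard let messageID = uiState.messages.last?.id else { return }

        for await token in tokenStream {
            await manager.streamToken(token, to: messageID)
        }
    }
}
```

Note the ordering: `addNewMessage` completes (including its hop to the MainActor) before we read `uiState.messages.last`, so the message ID is guaranteed to be there.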

Why This Design Wins

This architecture follows Apple's best practices for several reasons:

  1. Compile-Time Safety: The compiler refuses to build the app if you try to modify internalMessages from outside the actor's isolation.
  2. Responsiveness: By using async/await, we ensure the main thread is never blocked. The UI remains interactive even while the actor is processing a heavy stream of AI data.
  3. Efficiency: @Observable ensures that SwiftUI only updates the specific message bubble receiving the tokens, saving battery life and CPU cycles.
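Point 1 is concrete, not figurative. A sketch of what the compiler rejects (the diagnostic text is paraphrased, not the exact wording):

```swift
// This does not compile in a synchronous, nonisolated context:
func sendGreeting(manager: ChatManager) {
    // error: actor-isolated method cannot be called synchronously
    // from outside the actor
    manager.addNewMessage(role: .user, content: "hi")

    // The fix is to hop into the actor from an async context:
    // await manager.addNewMessage(role: .user, content: "hi")
}
```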

Conclusion

Swift 6 isn't just an incremental update; it’s a paradigm shift for developers building high-concurrency applications. By embracing Sendable types and actors, you stop fighting the compiler and start letting it help you build safer, faster AI experiences.

Let's Discuss

  1. Have you started migrating your projects to Swift 6's strict concurrency mode yet? What has been your biggest "aha!" moment or frustration?
  2. When building AI interfaces, do you prefer using actors for state management, or are you still relying on traditional dispatch queues? Why?

The concepts and code demonstrated here are drawn directly from the roadmap laid out in the ebook SwiftUI for AI Apps: Building reactive, intelligent interfaces that respond to model outputs, stream tokens, and visualize AI predictions in real time. You can find it here: Leanpub.com or Amazon.

Swift & AI Masterclass:
Book 1: Core ML & Vision Framework.
Book 2: Apple Intelligence & Foundation Models.
Book 3: Natural Language & Speech.
Book 4: SwiftUI for AI Apps.
Book 5: Create ML Studio.
Book 6: MLX Swift & Local LLMs.
Book 7: visionOS & Spatial AI.
Book 8: Swift + OpenAI & LangChain.
Book 9: CoreData, CloudKit & Vector Search.
Book 10: Shipping AI Apps to the App Store.

Check also all the other programming & AI ebooks on Python, TypeScript, C#, Swift, Kotlin: Leanpub.com or Amazon.
