DEV Community

Programming Central

Posted on • Originally published at programmingcentral.hashnode.dev

Mastering Dependency Injection in SwiftUI: Sharing AI Clients with @Environment

Building a sophisticated AI-powered application in SwiftUI brings a unique set of architectural challenges. When your app relies on multiple services—like an OpenAIChatClient for LLM interactions, a CoreMLPredictor for on-device tasks, or a WhisperSTTClient for voice—how do you get those clients into the views that need them?

If you find yourself passing client instances through five layers of view initializers, you’re suffering from prop drilling. If you’ve resorted to global singletons, you’ve likely realized they make testing a nightmare and lead to messy state management.

The solution is SwiftUI’s @Environment. By leveraging implicit dependency injection, you can create a clean, decoupled architecture that aligns perfectly with SwiftUI’s declarative nature.

The Problem: Prop Drilling vs. Global Singletons

In a complex AI assistant, a root ContentView might contain a ChatView, which contains a MessageInputView, which finally contains a SuggestionBarView. If that suggestion bar needs the AI client to generate predictive text, you have two traditional (and flawed) choices:

  1. Prop Drilling: Manually passing the client through every single intermediate view. This is fragile and makes refactoring nearly impossible.
  2. Global Singletons: Using OpenAIChatClient.shared. While easy, this creates tight coupling, makes it hard to swap in "mock" clients for unit testing, and can lead to shared-state bugs in multi-window environments.
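To make the first pitfall concrete, here is a sketch of what prop drilling looks like for the hierarchy above (the ChatClient stand-in and view bodies are illustrative, not from the article's later code):

```swift
import SwiftUI

// Stand-in client type, just for illustration.
final class ChatClient {}

// The leaf view is the only one that actually uses the client...
struct SuggestionBarView: View {
    let client: ChatClient
    var body: some View { Text("suggestions") }
}

// ...yet every intermediate view must accept it and forward it.
struct MessageInputView: View {
    let client: ChatClient
    var body: some View { SuggestionBarView(client: client) }
}

struct ChatView: View {
    let client: ChatClient
    var body: some View { MessageInputView(client: client) }
}
```

Every new dependency forces a change to every initializer in the chain, which is exactly the fragility described above.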

The Solution: Implicit Dependency Injection

SwiftUI’s @Environment property wrapper allows a view to declare exactly what it needs without worrying about where it comes from. Much like how SwiftData uses \.modelContext, we can create our own keys to inject AI services at the top of the view hierarchy and access them anywhere below.

Why this works for AI Apps:

  • Decoupling: Views focus on the UI; the environment handles the service logic.
  • Testability: You can easily inject a MockAIClient during previews or tests.
  • Concurrency Safety: Sharing one client instance through the environment means every view talks to the same object; if that client guards sensitive state like API keys and request queues behind an actor or the main actor, access stays thread-safe.
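To make the testability point concrete: because the client is an ordinary object rather than a hard-wired singleton, a test can drive it with no view hierarchy at all. A self-contained sketch (ChatClientUnderTest is a trimmed stand-in for the client built in the steps below):

```swift
import Observation

// Trimmed stand-in for the article's AI client.
@Observable
final class ChatClientUnderTest {
    var currentResponse = ""

    func sendRequest(prompt: String) async throws {
        // Fake latency in place of a real network call.
        try await Task.sleep(for: .milliseconds(5))
        currentResponse = "AI Response to: \(prompt)"
    }
}

// Drive it exactly as a view would, but headlessly.
let client = ChatClientUnderTest()
try await client.sendRequest(prompt: "Hello")
assert(client.currentResponse == "AI Response to: Hello")
```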

Step-by-Step: Implementing a Custom AI Environment Key

To share an AI client, we follow a three-step pattern: define the client, create a key, and extend the environment.

1. Define the AI Client (as an Observable Class)

AI operations are asynchronous and drive mutable UI state, like loading flags and streamed text. The @Observable macro (iOS 17+) lets SwiftUI track that state automatically. Note that @Observable applies to classes, not actors; if you also need to protect non-UI state such as a request queue, put that behind a separate actor and keep the UI-facing client observable.

@Observable
final class OpenAIChatClient {
    var currentResponse: String = ""
    var isLoading: Bool = false

    func sendRequest(prompt: String) async throws {
        isLoading = true
        defer { isLoading = false }
        // Simulate an AI network call
        try await Task.sleep(for: .seconds(1))
        currentResponse = "AI Response to: \(prompt)"
    }
}

2. Create the EnvironmentKey

Every EnvironmentKey must supply a defaultValue; this is what SwiftUI falls back on if you forget to inject a client upstream. Be aware that the fallback silently creates a fresh instance, so a missing injection won't crash, it will just quietly use the wrong client.

private struct OpenAIChatClientKey: EnvironmentKey {
    static let defaultValue = OpenAIChatClient()
}

extension EnvironmentValues {
    var openAIChatClient: OpenAIChatClient {
        get { self[OpenAIChatClientKey.self] }
        set { self[OpenAIChatClientKey.self] = newValue }
    }
}

3. Inject and Consume

Now you provide the client at the root of your app. Every view inside ContentView, along with all of its descendants, has access to it.

@main
struct AIApp: App {
    @State private var chatClient = OpenAIChatClient()

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(\.openAIChatClient, chatClient)
        }
    }
}

struct MessageInputView: View {
    // No initializers needed!
    @Environment(\.openAIChatClient) private var chatClient
    @State private var text = ""

    var body: some View {
        VStack {
            TextField("Ask AI...", text: $text)
            Button("Send") {
                Task { try? await chatClient.sendRequest(prompt: text) }
            }
            if chatClient.isLoading { ProgressView() }
            Text(chatClient.currentResponse)
        }
    }
}
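This is also where the testability payoff lands: a preview can inject a stub with canned state instead of a live client. The sketch below repeats minimal mirrors of the types from the steps above so it stands alone; in your project you would reference the real ones.

```swift
import SwiftUI

// Minimal mirrors of the article's types, repeated so this
// snippet compiles on its own.
@Observable
final class OpenAIChatClient {
    var currentResponse = ""
}

private struct OpenAIChatClientKey: EnvironmentKey {
    static let defaultValue = OpenAIChatClient()
}

extension EnvironmentValues {
    var openAIChatClient: OpenAIChatClient {
        get { self[OpenAIChatClientKey.self] }
        set { self[OpenAIChatClientKey.self] = newValue }
    }
}

struct MessageInputView: View {
    @Environment(\.openAIChatClient) private var chatClient
    var body: some View { Text(chatClient.currentResponse) }
}

// Inject a stub with canned state: the preview renders
// deterministically and never touches the network.
#Preview {
    let stub = OpenAIChatClient()
    stub.currentResponse = "Canned response for previews"
    return MessageInputView()
        .environment(\.openAIChatClient, stub)
}
```

The same `.environment(...)` call works in unit tests or UI tests, so the view under test never knows it is talking to a stub.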

The Power of Swift Concurrency and @Observable

When you combine @Environment with the @Observable macro (introduced with Swift 5.9 in iOS 17), your UI becomes incredibly reactive.

As your AI client streams tokens back from a model, it updates its internal state. Because the client is in the environment and marked as observable, SwiftUI knows exactly which views need to re-render. This ensures that even in deep view hierarchies, your "AI is typing..." indicators and streaming text blocks update with high efficiency and zero manual plumbing.
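Concretely, a streaming client can append each token to its observed state as it arrives. This is a sketch: StreamingChatClient and its streamTokens source are illustrative stand-ins, not a real API.

```swift
import Observation

@Observable
final class StreamingChatClient {
    var currentResponse = ""

    // Hypothetical token source; a real client would decode
    // server-sent chunks or a chunked HTTP body here.
    func streamTokens(for prompt: String) -> AsyncStream<String> {
        AsyncStream { continuation in
            for token in ["AI ", "Response ", "to: ", prompt] {
                continuation.yield(token)
            }
            continuation.finish()
        }
    }

    func send(prompt: String) async {
        currentResponse = ""
        // Each appended token mutates observed state, so any view
        // reading currentResponse re-renders as tokens arrive.
        for await token in streamTokens(for: prompt) {
            currentResponse += token
        }
    }
}
```

Because the mutation happens one token at a time, views reading `currentResponse` redraw incrementally, which is exactly the "AI is typing" effect users expect.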

Furthermore, by ensuring your data models (like ChatRequest or ChatResponse) conform to Sendable, the Swift compiler can verify that they are safe to pass across isolation boundaries, for example between your MainActor-bound UI and a background actor performing network or inference work.
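A minimal sketch of such models (the field names are illustrative, not taken from a specific API):

```swift
// Value types whose stored properties are all Sendable satisfy
// Sendable themselves; the compiler rejects anything else that
// tries to cross an isolation boundary.
struct ChatRequest: Sendable {
    let prompt: String
    let temperature: Double
}

struct ChatResponse: Sendable {
    let text: String
    let finishReason: String?
}

let request = ChatRequest(prompt: "Summarize this", temperature: 0.2)
let response = ChatResponse(text: "A summary.", finishReason: "stop")
```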

Conclusion

Using @Environment for AI clients isn't just about writing less code—it's about building a scalable, professional architecture. It eliminates the clutter of prop drilling, avoids the pitfalls of singletons, and leverages the full power of Swift’s modern concurrency model.

By treating your AI clients as environmental dependencies, you make your views more modular, your testing more robust, and your codebase ready for the complexities of modern AI integration.

Let's Discuss

  1. Have you encountered "prop drilling" pain in your SwiftUI projects, and how did you resolve it before discovering @Environment?
  2. When building AI features, do you prefer using actors for client logic, or do you stick to standard classes with manual synchronization? Let’s talk about the pros and cons in the comments!

The concepts and code demonstrated here are drawn directly from the comprehensive roadmap laid out in the ebook
SwiftUI for AI Apps. Building reactive, intelligent interfaces that respond to model outputs, stream tokens, and visualize AI predictions in real time. You can find it here: Leanpub.com or Amazon.
Check also all the other programming ebooks on Python, TypeScript, C#, and Swift: Leanpub.com or Amazon.
