Originally published on Medium. Cross-posted here for the Dev.to community.
I have a confession. I used to spend an embarrassing amount of time scrolling through wallpaper websites.
You know the routine. You unlock your phone forty times a day, and every single time you see the same stock gradient Apple shipped with iOS 17. So you open Safari, type "cool iPhone wallpapers," and enter a black hole of ad-infested galleries, watermarked images, and wallpapers that look great as thumbnails but fall apart at full resolution.
One night, after closing the fourth tab of a sketchy wallpaper site, I thought: I know exactly what I want my wallpaper to look like. I just can't find it because it doesn't exist yet.
And then it hit me. With text-to-image AI, it doesn't need to exist yet. I can describe it and have it created on the spot.
That night, I built WallCraft AI. An iOS app that turns text descriptions into unique, high-resolution phone wallpapers. And I learned more about shipping a product in one night than I did in months of planning other projects.
The Idea: Why AI + Wallpapers Is a Real Product
The wallpaper market is massive and ancient. Zedge has been around since 2003. There are thousands of wallpaper apps on the App Store. But they all share the same fundamental problem: they serve you a finite catalog of images that someone else made.
AI flips this completely. Instead of browsing, you describe. Instead of settling, you generate. The wallpaper doesn't exist until you ask for it, which means every single one is unique. No one else on the planet has your wallpaper.
This isn't a solution looking for a problem. People already want custom wallpapers -- they just don't have a fast way to make them. The typical alternative is opening Midjourney, generating an image, cropping it to phone dimensions, AirDropping it, then setting it manually. Five steps for something that should be one.
I wanted to build the one-step version: type what you want, tap generate, set as wallpaper. Done.
If you just want to try it: WallCraft AI on the App Store. If you want the full story, keep reading.
Architecture Decisions: Keeping It Stupid Simple
Before writing a single line of code, I made three decisions that saved me hours.
SwiftUI, Not UIKit
I know UIKit. I've shipped production apps with it. But for a new project in 2026, SwiftUI is the obvious choice. The declarative syntax lets me iterate on UI faster, the @Observable macro (iOS 17) eliminates the boilerplate of ObservableObject and @Published, and features like NavigationStack and .sheet() make navigation trivial.
The entire app is pure SwiftUI. No UIViewRepresentable hacks, no UIKit bridges.
SwiftData, Not Core Data
Wallpapers need to be persisted locally. Users expect their generated wallpapers to survive app restarts. I chose SwiftData because it integrates natively with SwiftUI, uses the @Model macro for zero-boilerplate persistence, and handles large binary data with a single attribute:
```swift
import Foundation
import SwiftData

@Model
final class WallpaperEntity {
    var id: UUID
    var prompt: String
    @Attribute(.externalStorage) var imageData: Data
    @Attribute(.externalStorage) var thumbnailData: Data?
    var style: String?
    var createdAt: Date
    var isFavorite: Bool

    // @Model doesn't synthesize a memberwise init, so we provide one.
    init(prompt: String, imageData: Data, style: String? = nil) {
        id = UUID()
        self.prompt = prompt
        self.imageData = imageData
        self.style = style
        createdAt = .now
        isFavorite = false
    }
}
```
That @Attribute(.externalStorage) annotation tells SwiftData to store image data as separate files on disk instead of bloating the SQLite database. Without it, performance tanks once a user has 50+ wallpapers saved.
StoreKit 2, Not the Legacy API
If you've ever implemented in-app purchases with the original StoreKit, you know the pain: receipt validation servers, transaction queues, opaque error codes. StoreKit 2 replaces all of that with clean Swift concurrency APIs. No server needed. Verification is built in.
Zero External Dependencies
The entire project has zero third-party packages. No Alamofire, no SDWebImage, no Kingfisher. URLSession handles networking. UIImage handles image processing. SwiftData handles persistence. This isn't dogma -- it's pragmatism. Every dependency is a build time cost, a potential breaking change, and a supply chain risk. For a small app, the standard library is enough.
The full project structure:
```
WallCraftAI/
├── App/
│   └── WallCraftAIApp.swift
├── Views/
│   ├── GenerateView.swift
│   ├── GalleryView.swift
│   ├── PaywallView.swift
│   └── OnboardingView.swift
├── ViewModels/
│   ├── GenerateViewModel.swift
│   └── GalleryViewModel.swift
├── Services/
│   ├── AIImageService.swift
│   ├── StoreKitManager.swift
│   └── WallpaperStorage.swift
└── Models/
    ├── Wallpaper.swift
    └── Style.swift
```
MVVM, clean separation, nothing clever. The entire app is roughly 2,000 lines of Swift.
The AI Engine: Advanced Image Generation
This is where the magic happens. The core of WallCraft AI is a single API call to an advanced AI image generation endpoint.
The API Call
The service is surprisingly simple. One class, one method, one endpoint:
```swift
import Foundation

@Observable
final class AIImageService {
    private let session: URLSession
    private let endpoint: URL  // AI image generation API endpoint
    private let apiKey: String

    init(endpoint: URL, apiKey: String, session: URLSession = .shared) {
        self.endpoint = endpoint
        self.apiKey = apiKey
        self.session = session
    }

    func generateWallpaper(prompt: String, style: String? = nil) async throws -> Data {
        let fullPrompt = buildPrompt(base: prompt, style: style)
        let requestBody: [String: Any] = [
            "prompt": fullPrompt,
            "n": 1,
            "size": "1024x1536",
            "quality": "high"
        ]

        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.timeoutInterval = 120
        request.httpBody = try JSONSerialization.data(withJSONObject: requestBody)

        let (data, response) = try await session.data(for: request)
        guard let http = response as? HTTPURLResponse, http.statusCode == 200 else {
            throw URLError(.badServerResponse)
        }
        // ... parse base64 image from response
    }
}
```
Size: 1024x1536. Portrait aspect ratio that fits modern iPhones perfectly.
Quality: "high". The difference between standard and high is significant on a phone screen you hold six inches from your face.
Timeout: 120 seconds. AI image generation typically takes 10-30 seconds, but under heavy load it can stretch longer.
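The parsing step elided in the service above depends on the provider's response shape. Here is a minimal sketch, assuming a JSON body containing a base64-encoded image; the field names (`data`, `b64_json`) and the `GenerationResponse` type are assumptions for illustration, not the app's actual types:

```swift
import Foundation

// Assumed response shape: { "data": [ { "b64_json": "<base64 image>" } ] }
struct GenerationResponse: Decodable {
    struct Item: Decodable { let b64_json: String }
    let data: [Item]
}

// Decode the first generated image out of a raw response body.
func decodeImageData(from body: Data) throws -> Data {
    let response = try JSONDecoder().decode(GenerationResponse.self, from: body)
    guard let first = response.data.first,
          let image = Data(base64Encoded: first.b64_json) else {
        throw URLError(.cannotDecodeContentData)
    }
    return image
}
```

Keeping this in a separate function makes it trivial to unit-test with a canned JSON fixture, without ever hitting the network.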
The Style System: 15 Flavors of Prompt Engineering
Here's the feature that makes WallCraft AI more than just a text box connected to an API. Users choose from 15 artistic styles, and each style modifies the prompt with carefully crafted suffixes:
```swift
enum Style: String, CaseIterable, Identifiable, Codable {
    case none, anime, minimal, nature, abstract, cyberpunk,
         watercolor, oilPainting, photography, render3d,
         fantasy, neon, vintage, geometric, gradient

    var id: String { rawValue }

    var promptSuffix: String {
        switch self {
        case .none:
            ""
        case .cyberpunk:
            ", cyberpunk aesthetic, neon lights, dark urban environment, futuristic, rain-soaked streets"
        case .watercolor:
            ", watercolor painting, soft washes, blended edges, paper texture, delicate tones"
        case .anime:
            ", anime art style, vibrant colors, detailed cel shading, Studio Ghibli inspired"
        // ... 11 more styles
        }
    }
}
```
I spent more time tweaking these prompt suffixes than writing the actual networking code. The difference between "watercolor style" and "watercolor painting, soft washes, blended edges, paper texture, delicate tones" is enormous in output quality.
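The `buildPrompt` helper referenced in `AIImageService` ties the two pieces together. A minimal sketch of how it might look; the wallpaper-oriented framing text is an assumption, not the app's actual wording:

```swift
import Foundation

// Combine the user's description with an optional style suffix.
// The framing prefix below is a hypothetical example.
func buildPrompt(base: String, style: String?) -> String {
    let trimmed = base.trimmingCharacters(in: .whitespacesAndNewlines)
    var prompt = "Phone wallpaper, vertical composition: \(trimmed)"
    // Style suffixes already start with ", ", so plain append works.
    if let style, !style.isEmpty {
        prompt += style
    }
    return prompt
}
```

Because the suffixes are pure data on the `Style` enum, adding a sixteenth style never touches the networking code.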
Monetization: Freemium Done Right with StoreKit 2
The business model is simple: 1 free generation per day, unlimited with Pro.
| Tier | Price | Why It Exists |
|---|---|---|
| Weekly | $4.99 | Low commitment trial |
| Monthly | $9.99 | The standard tier |
| Yearly | $49.99 | Best value (save 58%) |
The entire in-app purchase system fits in one file:
```swift
import StoreKit

@Observable
final class StoreKitManager {
    static let freeDailyLimit = 1

    var products: [Product] = []
    var isProUser: Bool = false
    var dailyGenerationsUsed: Int = 0

    var canGenerate: Bool {
        isProUser || dailyGenerationsUsed < Self.freeDailyLimit
    }

    @discardableResult
    func purchase(_ product: Product) async throws -> Transaction? {
        let result = try await product.purchase()
        switch result {
        case .success(let verification):
            let transaction = try checkVerified(verification)
            await transaction.finish()
            await checkSubscriptionStatus()
            return transaction
        case .userCancelled, .pending:
            return nil
        @unknown default:
            return nil
        }
    }

    // checkVerified(_:) and checkSubscriptionStatus() elided for brevity
}
```
No server needed. StoreKit 2 handles cryptographic verification on-device.
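The one piece StoreKit doesn't give you is the free tier's daily counter, which has to reset when the calendar day rolls over. A minimal sketch of that logic, assuming a counter plus a last-used date (the real app would persist these, e.g. in UserDefaults; the type and its names are hypothetical):

```swift
import Foundation

// Tracks free generations per calendar day. The counter resets
// implicitly: if the last use was on a different day, it reads as 0.
struct DailyLimiter {
    let limit: Int
    private(set) var count: Int = 0
    private(set) var lastUsed: Date?
    private let calendar = Calendar.current

    init(limit: Int) {
        self.limit = limit
    }

    func canGenerate(now: Date = Date()) -> Bool {
        effectiveCount(now: now) < limit
    }

    mutating func recordGeneration(now: Date = Date()) {
        count = effectiveCount(now: now) + 1
        lastUsed = now
    }

    private func effectiveCount(now: Date) -> Int {
        guard let lastUsed, calendar.isDate(lastUsed, inSameDayAs: now) else {
            return 0  // new day (or first ever use): counter resets
        }
        return count
    }
}
```

Note this is client-side only, which is fine for a soft limit; a determined user could bypass it, but for a $4.99 weekly subscription that risk is acceptable.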
Five Lessons I Learned
1. Ship Fast, Iterate Later
The hardest feature to build is the "stop" button. A shipped app with four features beats an unshipped app with twenty.
2. AI Makes Content Products Viable for Solo Devs
I didn't create a single wallpaper. The entire content library is generated by users, on demand, via an API call. This is what AI unlocks for indie developers.
3. Prompt Engineering IS the Product
The difference between a mediocre AI wrapper and a good product is prompt engineering. The wrapper prompt, the style suffixes, the quality parameter -- they all exist to ensure that first wallpaper is a jaw-dropper.
4. Zero Dependencies = Zero Surprises
No pod install failures. No breaking changes from a minor version bump. The standard library is powerful enough for most indie apps.
5. iOS First for Subscription Products
The App Store generates roughly twice the consumer revenue of Google Play. For a subscription app, the platform where people actually pay for software is the one to launch on first.
Try It
WallCraft AI is available on the App Store. Download it, type something wild, and see what the AI creates for you. If you like it, a review goes a long way for an indie developer.
And if you're a developer thinking about building with AI APIs -- just do it. The tools are ready. The APIs are stable. The hard part isn't the technology. It's deciding to ship.
Andy Garcia is a freelance developer based in France, building iOS apps and automation tools. You can find WallCraft AI on the App Store.