Caden Burleson

From web to iOS in 30 days: Expo + AI-assisted code conversion

Last month I shipped Fitnit's iOS app. Fitnit is a fitness app that uses your phone's camera to count exercise reps with AI and reads meal photos for macros. The web version had 4,779 users when I made the call to ship native.

I didn't write a Swift app from scratch. I used Expo + React Native for the shell, AI to translate most of my existing JS into TypeScript that runs under React Native, and wrote selective Swift native modules only for the performance-critical parts. Took ~30 days end to end, mostly solo.

This is a different story than "pure native Swift rewrite." Here's what actually happened.

The architecture decision: pure Swift vs Capacitor vs Expo

When I started, I had three options. Each had different math.

Pure Swift / SwiftUI from scratch. The "ideologically clean" path. Best performance, best App Store-citizen status, fully native UX. Also: a complete rewrite of the entire client, in a language I don't ship daily, with zero code reuse from the web version. Solo founder math says no.

Capacitor (web app in a native shell). Cheapest path — you literally wrap your existing web app in a WebView. But Fitnit's core loop is "camera → 30fps pose detection → form feedback in real time." That's exactly the case where a WebView becomes the bottleneck. Tried it for a day, confirmed the performance gap was unworkable, moved on.

Expo + React Native + selective native modules. The middle path. React Native for the shell + most app logic; Expo's tooling (EAS Build, EAS Submit, expo-router, expo-camera) for the iOS-specific glue; custom Swift native modules via the Expo Modules API for the pieces where performance actually matters.

Picked option 3. Tradeoff accepted: not as performant as pure Swift, not as cheap as Capacitor, but the only path where one person can ship in 30 days while reusing the bulk of the existing codebase.

The AI-assisted conversion

The web app is vanilla JS + Vite. To go to React Native, every component, every hook, every piece of business logic needed to either translate cleanly to React Native or be rewritten.

I used Claude Code for the bulk of the translation (Codex or Cursor work too — pick the tool you're already comfortable with; the technique is what matters). The workflow was:

  1. Paste an entire web component into the AI, ask for "React Native equivalent using Expo Router + StyleSheet."
  2. The AI returns a first draft that's ~80% correct.
  3. Manually fix the 20%: navigation patterns, gestures, anything platform-specific the AI guessed wrong about.
  4. Move to the next component.

What AI translation handled well:

  • Component structure (functional components, hooks, props)
  • State management logic (the AI is fine porting useState / useEffect / useMemo)
  • All the pure business logic: rep-counting state machines, calorie math, macro target calculations — these translated to TypeScript verbatim
  • Supabase client calls (same SDK works on RN with one config tweak)
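
That last point is mostly configuration. A minimal sketch of what the React Native client setup looks like, assuming the standard supabase-js + AsyncStorage combination (the env var names here are placeholders, not Fitnit's actual config):

```typescript
// lib/supabase.ts — illustrative config sketch.
// On React Native there's no localStorage, so supabase-js needs an
// explicit storage adapter for auth session persistence.
import 'react-native-url-polyfill/auto';
import AsyncStorage from '@react-native-async-storage/async-storage';
import { createClient } from '@supabase/supabase-js';

export const supabase = createClient(
  process.env.EXPO_PUBLIC_SUPABASE_URL!,
  process.env.EXPO_PUBLIC_SUPABASE_ANON_KEY!,
  {
    auth: {
      storage: AsyncStorage,     // persist sessions across app launches
      autoRefreshToken: true,
      persistSession: true,
      detectSessionInUrl: false, // no URL-based auth redirects on native
    },
  }
);
```

Every query and auth call after this is identical to the web client.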

What AI translation handled badly:

  • Navigation. Web routing → Expo Router is a different mental model. Had to do that manually.
  • Anything camera-related. The AI tried to invent web-shaped APIs that don't exist on RN. Had to scrap and rewrite using expo-camera and the native module bridge.
  • Conditional rendering for tiny screens. The AI didn't account for one-handed iPhone usage; spacing needed manual passes.

Net: the AI saved me maybe 40-50% of the conversion time. Not magic, but real. And it was much more accurate than I expected for the pure-logic translations.

What carried over (more than I expected)

Everything backend. Supabase doesn't care which client is hitting it — same database, same auth, same edge functions. The iOS app uses the same users, workouts, and nutrition_entries tables the web app does. A user who signs up on iOS can log into the web app and see their workout history seamlessly. This was the single biggest reason I chose Supabase over Firebase a year ago: I knew this day was coming.

The data model. Rep counts, form scores, nutrition entries, macros — same shape on both clients. No migration, no API translation layer.

All the pure logic. Calorie math, macro targets, the rep-counting state machine logic, the form-score heuristics — all of it stayed as TypeScript and ran inside React Native unchanged. This is the big payoff of choosing Expo over Swift-from-scratch. With pure Swift I would have rewritten ~5,000 lines of business logic in a new language. With Expo I rewrote ~0.
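
To make that concrete, here's a hypothetical sketch of the kind of pure logic that ports verbatim — the function names and macro ratios are mine for illustration, not Fitnit's actual code:

```typescript
// Pure TypeScript that runs unchanged on web and React Native.
// Standard 4/4/9 kcal-per-gram conversion factors.
interface Macros {
  proteinG: number;
  carbsG: number;
  fatG: number;
}

// Calories from macro grams: protein 4 kcal/g, carbs 4 kcal/g, fat 9 kcal/g.
export function caloriesFromMacros(m: Macros): number {
  return m.proteinG * 4 + m.carbsG * 4 + m.fatG * 9;
}

// Split a daily calorie target into macro gram targets by ratio.
export function macroTargets(
  dailyKcal: number,
  ratios = { protein: 0.3, carbs: 0.4, fat: 0.3 }
): Macros {
  return {
    proteinG: (dailyKcal * ratios.protein) / 4,
    carbsG: (dailyKcal * ratios.carbs) / 4,
    fatG: (dailyKcal * ratios.fat) / 9,
  };
}

console.log(caloriesFromMacros({ proteinG: 30, carbsG: 40, fatG: 10 })); // 370
```

Nothing here touches the DOM or any platform API, which is exactly why the AI translated this category flawlessly.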

The pSEO (programmatic SEO) + marketing site. That stays vanilla JS + Vite + Cloudflare Pages. The iOS app doesn't ship that part of the codebase. The landing page at /ios just deep-links to the App Store.

The brand. Same colors, same name, same positioning ("AI rep counting + photo nutrition tracking, all in one app"). Switching between web and iOS should feel like Spotify on desktop vs phone — not like two different products.

What I rebuilt natively (Swift, via Expo Modules API)

The whole appeal of Expo for an AI fitness app is that you can drop down to native Swift when the bridge cost is too high — without leaving the React Native world. The Expo Modules API makes this surprisingly approachable.

I wrote native Swift modules for two things:

1. Pose detection (the highest-leverage native module)

Web version uses MediaPipe Pose. On iOS, going through MediaPipe-via-RN was possible but added latency. Going direct to Apple's Vision framework via a native module was faster and let me use APIs that don't have RN wrappers.

The module shape:

// modules/pose-detection/ios/PoseDetectionModule.swift
import ExpoModulesCore
import Vision
import UIKit

public class PoseDetectionModule: Module {
  public func definition() -> ModuleDefinition {
    Name("PoseDetection")

    AsyncFunction("detectFromBuffer") { (imageData: Data) -> [String: Any] in
      let handler = VNImageRequestHandler(data: imageData, options: [:])
      let request = VNDetectHumanBodyPoseRequest()
      try handler.perform([request])

      guard let observation = request.results?.first else {
        return ["points": [], "confidence": 0]
      }
      return PoseSerializer.serialize(observation)
    }
  }
}

And the React Native side:

// app/components/RepCounter.tsx
import PoseDetection from '../../modules/pose-detection';

async function processFrame(frameData: Uint8Array) {
  const result = await PoseDetection.detectFromBuffer(frameData);
  return updateRepStateMachine(result.points);  // ← pure TS, same as web
}

Note what's happening: the state machine (the part that decides "this is a rep" by watching elbow angle + shoulder Y over time) runs in plain TypeScript. The same logic, more or less, that ran in the web app. Only the frame-to-keypoints inference crosses the bridge to Swift.

That split is the whole game. Native where speed matters; JS/TS for everything else.
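
For illustration, a minimal hysteresis-based version of such a state machine might look like this — the thresholds and types are invented, not Fitnit's actual tuning:

```typescript
// Hypothetical sketch of a rep-counting state machine driven by a joint
// angle. Pure TS: identical on web (MediaPipe keypoints) and iOS (Vision
// keypoints) — only the source of the angle differs.
type Phase = 'up' | 'down';

interface RepState {
  phase: Phase;
  reps: number;
}

export function initialRepState(): RepState {
  return { phase: 'up', reps: 0 };
}

// Feed one frame's elbow angle (degrees). A rep counts when the arm goes
// below the "down" threshold and then back above the "up" threshold.
export function updateRepState(
  state: RepState,
  elbowAngleDeg: number,
  downBelow = 70,
  upAbove = 160
): RepState {
  if (state.phase === 'up' && elbowAngleDeg < downBelow) {
    return { phase: 'down', reps: state.reps };
  }
  if (state.phase === 'down' && elbowAngleDeg > upAbove) {
    return { phase: 'up', reps: state.reps + 1 };
  }
  return state;
}
```

Two thresholds instead of one is the standard trick: jitter in the angle estimate near a single threshold would otherwise double-count reps.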

2. HealthKit bridge

Apple Health integration needed a small Expo native module too — read/write workouts and nutrition data into HKHealthStore.

The Swift side is ~80 lines:

import ExpoModulesCore
import HealthKit

public class HealthKitModule: Module {
  let store = HKHealthStore()

  public func definition() -> ModuleDefinition {
    Name("HealthKit")

    AsyncFunction("requestPermissions") { () -> Bool in
      // requestAuthorization takes Set<HKSampleType> to share and
      // Set<HKObjectType> to read, so the two sets are typed separately.
      let shareTypes: Set<HKSampleType> = [
        HKObjectType.workoutType(),
        HKQuantityType.quantityType(forIdentifier: .dietaryEnergyConsumed)!,
        // ... etc
      ]
      let readTypes: Set<HKObjectType> = [
        HKObjectType.workoutType(),
        HKQuantityType.quantityType(forIdentifier: .dietaryEnergyConsumed)!,
        // ... etc
      ]
      try await store.requestAuthorization(toShare: shareTypes, read: readTypes)
      return true
    }

    AsyncFunction("saveWorkout") { (params: WorkoutParams) -> Void in
      let workout = HKWorkout(
        activityType: params.activityType.hkType,
        start: params.startedAt,
        end: params.endedAt,
        duration: params.duration,
        totalEnergyBurned: HKQuantity(unit: .kilocalorie(), doubleValue: params.calories),
        totalDistance: nil,
        metadata: ["FitnitExerciseType": params.exerciseType]
      )
      try await store.save(workout)
    }
  }
}

Total native iOS code in the entire Fitnit project: ~300 lines across two modules. Everything else is TypeScript/React Native.

EAS Build + EAS Submit (the deployment story)

The other thing Expo gets you that's underrated: the deployment tooling. EAS Build handles iOS signing, archive, and binary creation in the cloud. EAS Submit pushes that binary to App Store Connect.

Effective dev loop:

eas build --platform ios --profile production
# ... ~15 min later, .ipa file is built and signed in the cloud ...
eas submit --platform ios --latest
# ... uploaded to App Store Connect, ready for review ...

No Xcode dance. No fiddling with provisioning profiles in a UI. No "what version of Xcode does this require." For a solo founder who would rather spend that time on product, EAS pays for itself within the first build.
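
For reference, those --profile flags resolve against eas.json. A hedged sketch of what a minimal production setup might look like (the ascAppId and version constraint are placeholders):

```json
{
  "cli": { "version": ">= 5.0.0" },
  "build": {
    "production": {
      "autoIncrement": true,
      "ios": { "resourceClass": "m-medium" }
    }
  },
  "submit": {
    "production": {
      "ios": { "ascAppId": "1234567890" }
    }
  }
}
```

autoIncrement alone kills a whole class of "build number already used" rejections.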

App Store submission: the actual experience

You hear horror stories. Mine wasn't a horror story, but it wasn't trivial either.

Round 1 rejected — guideline 2.1 (App Completeness): they wanted more screenshots showing core functionality. Easy fix, recaptured + framed the additional screenshots, resubmitted.

Round 2 rejected — guideline 5.1.1 (Data Collection and Storage): they wanted privacy clarifications around camera usage. The privacy policy on fitnitapp.com/privacy already explained that camera frames stay on-device, but the App Store privacy nutrition labels needed updating to match. Fix: tightened the labels in App Store Connect, re-attached the existing privacy policy URL, resubmitted.

Round 3 approved.

Useful tip I wish I'd had: prepare the App Store Connect listing BEFORE submitting the binary. Screenshots, descriptions, keywords, pricing, privacy labels — get all of those right first, then submit. Reviewers seem more lenient with apps that have a polished listing.

What I'd do differently

1. Ship to TestFlight on day 5, not day 25. I held off on TestFlight because I wanted the iOS experience to be feature-complete before letting anyone touch it. Mistake — getting 20 real testers on the build a week early would have caught at least three bugs I shipped to App Store review.

2. Write the native modules first, then build the RN screens against them. I did it the other way around — built the RN UI, then realized I needed the pose detection module, scrambled to write it, found that its return shape didn't quite match what my UI expected. Doing the native modules first defines the data contract, and then the RN side is just composing against it.
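
In practice that means writing the TypeScript side of the contract before any Swift. A hypothetical sketch (the names are illustrative, not Fitnit's actual types):

```typescript
// Define the bridge's data contract in TypeScript first, then implement
// the Swift module against it.
export interface Keypoint {
  name: string;       // e.g. "left_elbow"
  x: number;          // normalized 0..1
  y: number;          // normalized 0..1
  confidence: number; // per-keypoint confidence, 0..1
}

export interface PoseResult {
  points: Keypoint[];
  confidence: number; // overall detection confidence
}

// Runtime guard for data crossing the native bridge — catches a
// Swift-side shape mismatch at the boundary instead of deep inside the
// rep state machine.
export function isPoseResult(value: unknown): value is PoseResult {
  const v = value as PoseResult;
  return (
    typeof v === 'object' &&
    v !== null &&
    Array.isArray(v.points) &&
    typeof v.confidence === 'number'
  );
}
```

With the types pinned down, the Swift serializer and the RN UI can be built in either order and still meet in the middle.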

3. Build the iOS landing page (/ios) WHILE the app is in App Store review. I built it after approval and lost ~10 days of organic discovery I could have been earning. The /ios page should be ready, indexed, and getting traffic by the time the App Store listing goes live so they reinforce each other.

The honest take on Expo + AI for solo founders

Pure Swift gets you a slightly better app. Expo + AI-assisted conversion gets you a shipped app in roughly a quarter of the time with 95% of the user-facing quality. For most solo founders, the second one is the right tradeoff.

The Expo Modules API is the unlock. You're not committing to "everything in JS." You're committing to "JS by default, native where it matters." For an AI fitness app where the camera pipeline is the entire product, having an escape hatch into Swift for that specific subsystem is exactly what you need.

If you're sitting on a web app and you're considering native iOS, I'd skip the Capacitor approach unless your app is genuinely a thin layer over a website. Expo + selective native modules is the path I'd recommend without reservation.

The numbers, 30 days in

  • Web: 4,779 users, still growing
  • iOS: 406 installs, [number] of which converted to Pro
  • App Store rating: ★★★★★ from 3 reviews so far

If you've gone from web-first to native iOS with Expo, I'd love to hear what surprised you most. I'm at fitnitapp.com.

And if you'd like to try it on iOS: Fitnit on iOS
