DEV Community

Hamber


Building a Handheld Console with Flutter

The Afternoon I Got Stuck on a Japanese Dialogue

One weekend last year, I was playing a classic GBA RPG and hit an NPC conversation I couldn't read — all in Japanese. I screenshotted it, switched to a translation app, looked it up, switched back. The game had already moved on.

That context switch was deeply annoying.

I'm a Flutter GDE, and I happened to have a GBA emulator project called GoGBA sitting on my machine. I thought: what if I could press one button, without ever leaving the game, and have AI read the screen and translate it for me?

This article is the complete story of going from that idea to a shipped feature. The stack: Flutter + mGBA + Firebase AI (Gemini) + Riverpod + Clean Architecture. All real production code.

GoGBA is live on the App Store and Google Play — search GoGBA to download it.


Part 1: Architecture — Can Flutter Actually Run an Emulator?

Why Not Go Native

Emulators are performance-sensitive, so the instinct is "Flutter isn't fast enough." But GoGBA's emulation core is libretro/mGBA — a battle-tested C/C++ engine. Flutter only handles UI and event dispatch; it never touches the emulation logic.

That's what makes cross-platform viable:

Flutter UI (Dart)
      ↓  MethodChannel / EventChannel
Kotlin (Android) / Swift (iOS)
      ↓  JNI / C FFI
libretro mGBA (C/C++)

Flutter renders the game screen using the Texture widget — the native layer writes mGBA's framebuffer into a SurfaceTexture (Android) or CVPixelBuffer (iOS), and Flutter composites it directly. Zero-copy. 60fps with no issues.

Designing the Channel Boundaries

GoGBA uses three channels:

static const MethodChannel _channel =
    MethodChannel('go_gba/emulator');
static const MethodChannel _audioChannel =
    MethodChannel('go_gba/audio');
static const EventChannel _eventChannel =
    EventChannel('go_gba/emulator_events');
  • _channel: command traffic — load ROM, save state, cheats
  • _audioChannel: separated to prevent audio calls from blocking the game loop
  • _eventChannel: native-initiated events — RetroAchievements unlocks, leaderboard updates

The EventChannel is the design decision most people miss. Emulator events happen asynchronously on the native side. Polling with MethodChannel is wasteful. Surfacing them as a Dart Stream via EventChannel means a Riverpod provider can just watch it — fully reactive, no polling, no glue code.
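The pattern can be sketched in a few lines. This is a self-contained illustration, not GoGBA's actual code: a `StreamController` stands in for `EventChannel('go_gba/emulator_events').receiveBroadcastStream()`, and the event shape and names (`EmulatorEvent`, `achievement_unlocked`) are assumptions.

```dart
import 'dart:async';

// Stand-in for the native EventChannel: the platform side would push
// maps like {'type': 'achievement_unlocked', 'id': 42} into this stream.
final nativeEvents = StreamController<Map<String, Object?>>.broadcast();

/// Typed wrapper over the raw event maps (illustrative shape).
class EmulatorEvent {
  EmulatorEvent(this.type, this.payload);
  final String type;
  final Map<String, Object?> payload;
}

/// The Dart-facing stream; in the real app this is what a Riverpod
/// StreamProvider would expose so widgets can simply watch it:
///   final emulatorEventsProvider =
///       StreamProvider((ref) => emulatorEvents());
Stream<EmulatorEvent> emulatorEvents() => nativeEvents.stream
    .map((raw) => EmulatorEvent(raw['type'] as String, raw));

void main() async {
  final seen = <String>[];
  final sub = emulatorEvents().listen((e) => seen.add(e.type));

  // Simulate the native side unlocking an achievement.
  nativeEvents.add({'type': 'achievement_unlocked', 'id': 42});
  await Future<void>.delayed(Duration.zero); // let the event propagate

  print(seen); // contains 'achievement_unlocked'
  await sub.cancel();
}
```

The payoff is that consumers never poll: the provider rebuilds only when the native side actually emits something.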


Part 2: Can Clean Architecture Actually Work in Flutter?

Why Bother With Layers

GoGBA's early code was all crammed into PlayPage — emulator calls, save logic, and UI state tangled together. When cloud saves, cheats, and AI translation needed to be added, every change rippled unpredictably.

Clean Architecture's real value isn't aesthetics. It's letting features evolve independently.

GoGBA's layer structure:

pages / widgets / providers   ← Presentation
        ↓
domain/usecases               ← Application (business rules)
        ↓
domain/entities, ports,       ← Domain (pure Dart, no Flutter/dart:io)
repositories (interfaces)
        ↑ implements
data/repositories, core/emulator  ← Data / Infra
        ↓ MethodChannel
Kotlin / Swift / mGBA (native)

The hard rule: domain/ cannot import package:flutter/, dart:io, or anything from data/. Not a suggestion — a rule.

Enforcing Architecture with custom_lint

Code review alone will eventually miss things. GoGBA uses custom_lint to turn these constraints into compile-time errors:

# analysis_options.yaml
analyzer:
  plugins:
    - custom_lint

Two custom rules enforce the boundaries:

  • gogba_domain_layer_dependencies: blocks flutter / dart:io / data imports in domain
  • gogba_presentation_no_data_imports: blocks presentation from reaching into data directly

Now if anyone writes import 'package:flutter/material.dart' inside domain/, flutter analyze fails and CI catches it. The rule lives in the toolchain, not in someone's memory.

This is the single most effective architecture enforcement technique I've used in a real Flutter project.

Port/Adapter for Cross-Layer Dependencies

Riverpod providers need to read and write app config — but they shouldn't import ConfigDatasource directly (that's a data-layer type). GoGBA's solution:

// domain/ports/app_config_storage_port.dart (interface, pure Dart)
abstract class AppConfigStoragePort {
  Future<AppConfig> load();
  Future<void> updateConfig(AppConfig config);
}

// data/adapters/ (implementation)
class ConfigDatasourceAppConfigStorageAdapter
    implements AppConfigStoragePort { ... }

// providers/ (composition root)
final appConfigStoragePortProvider = Provider<AppConfigStoragePort>((ref) {
  return ConfigDatasourceAppConfigStorageAdapter();
});

Presentation depends only on the port interface. Tests swap in a fake. Widget tests don't need to touch the filesystem.
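A test fake is then a few lines. This sketch assumes a minimal AppConfig with a single field (the real entity has more); the port interface matches the one above:

```dart
// Minimal stand-in for the real AppConfig entity (illustrative field).
class AppConfig {
  const AppConfig({this.aiTranslationEnabled = false});
  final bool aiTranslationEnabled;
}

abstract class AppConfigStoragePort {
  Future<AppConfig> load();
  Future<void> updateConfig(AppConfig config);
}

/// In-memory fake: no filesystem, no platform channels, no async setup.
class FakeAppConfigStorage implements AppConfigStoragePort {
  AppConfig _config = const AppConfig();

  @override
  Future<AppConfig> load() async => _config;

  @override
  Future<void> updateConfig(AppConfig config) async => _config = config;
}

void main() async {
  final storage = FakeAppConfigStorage();
  await storage.updateConfig(const AppConfig(aiTranslationEnabled: true));
  final loaded = await storage.load();
  print(loaded.aiTranslationEnabled); // true
}
```

In a widget test, overriding appConfigStoragePortProvider with this fake keeps the whole test hermetic.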


Part 3: AI Real-Time Translation — One Button, Three Technical Layers

This is my favorite feature in GoGBA, and the most interesting engineering problem in the project.

Layer 1: Capturing the Game Screen

The GBA screen is a native texture — not a regular Flutter widget. You can't screenshot it the normal way.

GoGBA wraps the game view in a RepaintBoundary, then uses RenderRepaintBoundary.toImage() to capture the current frame. The expensive encoding work runs in a separate isolate:

// lib/core/utils/game_texture_capture.dart
Future<Uint8List?> captureGameTextureAsJpeg(
  GlobalKey key, {
  int? targetWidth,
  int? targetHeight,
}) async {
  final boundary = key.currentContext
      ?.findRenderObject() as RenderRepaintBoundary?;
  if (boundary == null) return null;

  final image = await boundary.toImage(pixelRatio: 1);
  final byteData =
      await image.toByteData(format: ui.ImageByteFormat.png);
  if (byteData == null) return null;

  final pngBytes = byteData.buffer.asUint8List();

  // PNG → resize → JPEG runs in an isolate — main thread stays unblocked
  return compute(_encodePngBytesToJpeg, (
    pngBytes: pngBytes,
    targetWidth: targetWidth,
    targetHeight: targetHeight,
  ));
}

The compute() call is the key detail. Image encoding and resizing happen in a dedicated isolate — the main thread stays responsive and the game keeps running without a hitch.

Layer 2: Gemini Multimodal Translation

GoGBA uses Firebase AI Logic (Vertex AI on Firebase) with the firebase_ai package:

// lib/core/services/game_screen_translation_service.dart
final GenerativeModel _model = FirebaseAI.vertexAI(location: 'global')
    .generativeModel(
      model: 'gemini-3.1-flash-lite-preview',
      generationConfig: GenerationConfig(
        maxOutputTokens: 512,
        temperature: 0.1,
        topP: 0.95,
        // Translation doesn't need reasoning — disable to save latency and tokens
        thinkingConfig: ThinkingConfig.withThinkingBudget(0),
      ),
    );

Future<String> translateJpeg({
  required List<int> jpegBytes,
  required String targetLanguageTag,
}) async {
  final prompt =
      'GBA screenshot: pixel UI. Transcribe all visible on-screen text, '
      'then translate it into "$targetLanguageTag". '
      'Use natural RPG/menu phrasing. '
      'Output only the translation text, no scene summary or extra commentary. '
      'If no readable text, reply exactly: No text detected.';

  final response = await _model.generateContent([
    Content.multi([
      InlineDataPart('image/jpeg', Uint8List.fromList(jpegBytes)),
      TextPart(prompt),
    ]),
  ]);
  return response.text?.trim() ?? '';
}

The prompt engineering choices are deliberate:

  • Use natural RPG/menu phrasing: keeps translations in-genre — "HP" won't become "Health Points"
  • Output only the translation text: strips the model's boilerplate preamble
  • If no readable text, reply exactly: No text detected.: structured fallback the client can match on
  • temperature: 0.1: translation is a deterministic task; higher temperature just adds noise
  • ThinkingConfig.withThinkingBudget(0): Gemini 2.x enables thinking by default; for translation it adds latency and tokens with no benefit — explicitly disable it
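On the client side, that exact-sentinel instruction reduces the "no text" case to a plain string comparison. A small sketch — the enum and function names are illustrative, not from GoGBA:

```dart
/// Outcome of one translation call (illustrative type).
enum TranslationOutcome { text, noText, empty }

// Must match the prompt's fallback line character-for-character.
const noTextSentinel = 'No text detected.';

TranslationOutcome classify(String raw) {
  final text = raw.trim();
  if (text.isEmpty) return TranslationOutcome.empty;
  if (text == noTextSentinel) return TranslationOutcome.noText;
  return TranslationOutcome.text;
}

void main() {
  print(classify('No text detected.')); // TranslationOutcome.noText
  print(classify('HP is too low!'));    // TranslationOutcome.text
  print(classify('   '));               // TranslationOutcome.empty
}
```

The UI can then show "no text found on screen" instead of rendering the sentinel as if it were a translation — and skip charging the quota for it.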

Layer 3: Monthly Quota Management

AI calls have real costs. GoGBA's AI translation is a separate subscription. Monthly usage limits are served from Firebase Remote Config, so they can be adjusted without a release:

// lib/domain/services/game_screen_translation_quota_service.dart
class GameScreenTranslationQuotaService {
  static String _currentUtcYm() {
    final u = DateTime.now().toUtc();
    return '${u.year.toString().padLeft(4, '0')}'
        '-${u.month.toString().padLeft(2, '0')}';
  }

  Future<bool> isExhausted(int monthlyLimit) async {
    if (monthlyLimit <= 0) return true;
    final uses = await getUsesThisMonth();
    return uses >= monthlyLimit;
  }

  Future<void> recordSuccessfulTranslation() async {
    final prefs = await SharedPreferences.getInstance();
    final count = await _usesForCurrentUtcMonth(prefs);
    await prefs.setInt(_keyCount, count + 1);
  }
}

UTC month, not local time. Users span time zones, and a locally computed month resets the quota at a different moment on every device (and shifts whenever the user travels or changes their clock). A UTC month gives everyone one consistent counting window.
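The month-keyed counter behind _usesForCurrentUtcMonth can be sketched like this — a plain Map stands in for SharedPreferences, and the reset-on-rollover behavior is my assumption about the implementation, not GoGBA's actual code:

```dart
String currentUtcYm(DateTime now) {
  final u = now.toUtc();
  return '${u.year.toString().padLeft(4, '0')}'
      '-${u.month.toString().padLeft(2, '0')}';
}

/// Month-keyed usage counter; `store` stands in for SharedPreferences.
class QuotaCounter {
  final Map<String, Object> store = {};

  int usesFor(DateTime now) {
    // Stored month differs from the current UTC month? The old count
    // no longer applies — treat it as zero (implicit monthly reset).
    if (store['ym'] != currentUtcYm(now)) return 0;
    return (store['count'] as int?) ?? 0;
  }

  void record(DateTime now) {
    final ym = currentUtcYm(now);
    if (store['ym'] != ym) {
      store['ym'] = ym; // roll over to the new month
      store['count'] = 0;
    }
    store['count'] = (store['count'] as int) + 1;
  }
}

void main() {
  final q = QuotaCounter();
  q.record(DateTime.utc(2024, 5, 31, 23, 59));
  print(q.usesFor(DateTime.utc(2024, 5, 31))); // 1
  print(q.usesFor(DateTime.utc(2024, 6, 1)));  // 0 — new UTC month
}
```

No scheduled reset job is needed: the counter lazily resets the first time it is read or written in a new UTC month.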

Wiring It Together: the UI Layer

// lib/pages/play/widgets/game_translation_bottom_sheet.dart
Future<void> _run() async {
  // 1. Capture the frame
  final jpeg = await captureGameTextureAsJpeg(
    key, targetWidth: vw, targetHeight: vh,
  );
  if (jpeg == null) return; // capture failed; nothing to translate

  // 2. Translate with Gemini (follows system language)
  final target =
      LocaleSettings.currentLocale.flutterLocale.toLanguageTag();
  final text = await GameScreenTranslationService.instance.translateJpeg(
    jpegBytes: jpeg,
    targetLanguageTag: target,
  );

  // 3. Show result and record quota
  setState(() {
    _phase = _TranslationPhase.success;
    _resultText = text;
  });
  if (widget.recordUsageOnSuccess) {
    await GameScreenTranslationQuotaService.instance
        .recordSuccessfulTranslation();
  }
}

What the user experiences: press the translate button → bottom sheet slides up → spinner for a second or two → translation appears. Behind that: frame capture, isolate encoding, multimodal AI call, quota write. All async. Game never pauses.


Part 4: AI-Assisted Development — What It Actually Feels Like

GoGBA's development workflow is deeply integrated with Claude Code (Anthropic's AI coding assistant). As a solo developer, it lets me maintain the kind of engineering discipline that normally takes a team.

A few real examples:

Architecture enforcement: The project's SKILL.md documents the domain layer rules and forbidden patterns. Claude Code reads this before every change and won't suggest code that violates the layering — the constraints stay consistent without manual review.

i18n automation: GoGBA ships in 24 languages. When a new feature adds UI strings, Claude Code fills in all language files in l10n/*.i18n.json, then triggers dart run slang to regenerate. What used to take 20 minutes of copy-paste takes seconds.

Fastlane releases: Build number bumps, changelog generation, App Store submission — all scripted. Claude Code runs the sequence and catches problems.

This isn't "AI replacing the developer." It's AI reducing the cost of following your own rules to near zero. Write the standards once; the tool enforces them.


Part 5: Bugs That Taught Me Things

Bug 1: ref.read() inside dispose() crashes

Riverpod's ref.read() and ref.watch() cannot be called after dispose(). The widget is gone; the provider may have already been released. GoGBA had a handful of early crashes from this. The rule is now in SKILL.md and detected by custom_lint.

Bug 2: invalidate(provider) causes an AsyncLoading flash

Calling invalidate after updating config forces the provider to rebuild from scratch, briefly putting the UI into a loading state. Users see a flicker. The fix: update state directly with state = newValue inside the notifier and let Riverpod diff it. No invalidation needed.
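The two paths can be contrasted with a simplified stand-in for a Riverpod notifier — the real code uses AsyncNotifier and ref.invalidate; the classes below only mimic the state transitions:

```dart
/// Simplified stand-ins for Riverpod's AsyncValue states.
sealed class AsyncState<T> {}

class AsyncLoading<T> extends AsyncState<T> {}

class AsyncData<T> extends AsyncState<T> {
  AsyncData(this.value);
  final T value;
}

class ConfigNotifier {
  AsyncState<int> state = AsyncData(0);
  final transitions = <String>[]; // what the UI would observe

  void _set(AsyncState<int> s) {
    state = s;
    transitions.add(s is AsyncLoading ? 'loading' : 'data');
  }

  // Anti-pattern: invalidating forces a rebuild from scratch,
  // so the UI briefly sees a loading state — the flicker.
  Future<void> invalidateAndReload(int newValue) async {
    _set(AsyncLoading<int>());
    _set(AsyncData(newValue));
  }

  // Fix: assign the new state directly; no loading phase at all.
  void updateInPlace(int newValue) => _set(AsyncData(newValue));
}

void main() async {
  final n = ConfigNotifier();
  await n.invalidateAndReload(1); // loading, then data
  n.updateInPlace(2);             // data only
  print(n.transitions); // [loading, data, data]
}
```

Only the invalidate path ever passes through `loading`; direct assignment goes straight from one data state to the next, so nothing flickers.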

Bug 3: Gemini's thinking mode is on by default

The firebase_ai Gemini 2.x models enable extended thinking by default. For translation — a deterministic task — this means longer latency, more tokens, and less predictable output. You have to explicitly disable it with ThinkingConfig.withThinkingBudget(0). The default bit me in early testing.


Closing Thoughts

GoGBA is my testbed for engineering ideas: where Flutter's cross-platform ceiling actually sits, whether Clean Architecture can hold up in a real project without becoming an interview-question abstraction, how to design AI features that are genuinely useful rather than just impressive in a demo.

My conclusions: Flutter is mature enough for this. AI tooling is raising the ceiling for individual developers in ways that weren't possible two years ago.

Every specific choice in this codebase — custom_lint guarding domain boundaries, compute() keeping the main thread clean, UTC-based quota windows — is a scar from a real mistake. I hope some of it saves you the same trouble.

Search GoGBA on the App Store or Google Play. If you play GBA games in Japanese or English and hit a text wall, the AI translation feature is there for exactly that.


Questions about Flutter cross-platform development, Firebase AI Logic integration, or shipping a solo app with an AI workflow? Drop them in the comments.
