
Haris
Now you can control your phone with offline AI — No APIs, no cloud, no latency

The Future of Local "Agentic" AI: FunctionGemma on Android

The most interesting part of this recent experimental release isn't the chatbot—it's a 270M parameter model called FunctionGemma.

It’s specifically designed for "Mobile Actions" (local function calling). This means you can control Android OS features (flashlights, intents, system settings) using natural language entirely offline.
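The core loop is straightforward: the model emits a structured function call, and the app maps it onto a local action instead of sending anything to a server. Here is a minimal sketch in Python of that dispatch pattern; the function names, JSON shape, and handlers are hypothetical illustrations, not the actual Mobile Actions schema.

```python
import json

# Hypothetical registry mapping function names the model might emit
# to local handlers. On Android the handlers would fire intents or
# toggle system services; here they just return strings.
HANDLERS = {
    "set_flashlight": lambda args: f"flashlight {'on' if args['on'] else 'off'}",
    "open_settings": lambda args: f"opening {args['screen']} settings",
}

def dispatch(model_output: str) -> str:
    """Parse a JSON function call emitted by the model and run the
    matching local handler -- no cloud round-trip involved."""
    call = json.loads(model_output)
    handler = HANDLERS.get(call["name"])
    if handler is None:
        raise ValueError(f"unknown function: {call['name']}")
    return handler(call.get("args", {}))

# "Turn on the flashlight" -> model emits a structured call:
print(dispatch('{"name": "set_flashlight", "args": {"on": true}}'))
```

The interesting part is that the 270M model only has to learn to emit this structured format reliably; everything after parsing is ordinary app code.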

What makes this a "must-click" for developers:

  • No API Keys: You aren't paying OpenAI or Google for tokens. The model lives in your phone's storage.
  • The .litertlm Format: It uses a specialized lightweight runtime that targets the mobile CPU/GPU directly.
  • Open Source Recipes: The dataset and "recipes" have been released on GitHub so you can fine-tune the 270M model for your own custom app functions.
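Fine-tuning for custom app functions boils down to pairing natural-language queries with the target function call you want the model to emit. A rough sketch of building such a training record follows; the field names are hypothetical, and the actual schema is defined by the recipes on GitHub.

```python
import json

def make_example(query: str, fn_name: str, args: dict) -> dict:
    # Hypothetical record layout: one user query mapped to the
    # structured call the model should learn to produce.
    return {
        "query": query,
        "target": {"name": fn_name, "arguments": args},
    }

example = make_example("Turn on the flashlight", "set_flashlight", {"on": True})
print(json.dumps(example))
```

Collecting a few dozen such pairs per custom function is typically the starting point before running whatever fine-tuning script the recipe repo provides.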

It’s essentially a glimpse into how "Agentic" AI will work on-device without the latency or privacy concerns of a cloud round-trip.

Full setup guide and technical breakdown: https://blog.harislab.tech/3/now-you-can-control-your-phone-with-offline-ai-no-apis-no-cloud-no-latency

Top comments (4)

Haris

The minimum requirements for running AI locally on your Android phone are 4 GB of RAM and Android 12 or later.

Haris

You can smoothly run the Gemma3-1B-IT model even on low-end devices.

Haris

You can also run the Gemma-3n-E2B-it model, which is heavier than Gemma3-1B-IT but supports image analysis, audio transcription, AI chat, and Prompt Lab.

Haris

I tried it on my phone and it works well, but heavier models cause UI crashes and freezing on low-end devices.