labs aivane

Why I’m building local-first Android automation for AI agents

AI agent demos often look magical — until they try to operate a real Android device.

Over the past few months I’ve been building AIVane / AI‑RPA, a local‑first Android automation workflow designed specifically for AI agents. The goal isn’t to claim “AI can use your phone like a human,” but to explore whether agent‑driven Android automation can be practical when privacy, control, and repeatability are treated as first‑class constraints.

Here’s the current model:

- A lightweight Android app runs locally on the device
- A computer connects over LAN
- The agent can:

  1. inspect the UI
  2. tap / type / swipe
  3. take screenshots
  4. navigate like a human operator
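To make that model concrete, here is a minimal sketch of the kind of payloads a computer-side agent could send to the on-device app over LAN. The action names and JSON shape are my own illustration, not AIVane's actual protocol:

```python
import json

# Hypothetical command payloads an agent could send to the on-device app.
# The action names and field layout are illustrative, not AIVane's real API.

def tap(x: int, y: int) -> str:
    return json.dumps({"action": "tap", "x": x, "y": y})

def type_text(text: str) -> str:
    return json.dumps({"action": "type", "text": text})

def swipe(x1: int, y1: int, x2: int, y2: int, duration_ms: int = 300) -> str:
    return json.dumps({"action": "swipe", "from": [x1, y1],
                       "to": [x2, y2], "duration_ms": duration_ms})

# An automation run is just an ordered list of serialized commands,
# which makes sessions easy to log, inspect, and replay.
script = [tap(540, 960), type_text("hello"), swipe(540, 1600, 540, 400)]
for cmd in script:
    print(cmd)
```

Because every step is a plain, serializable command, the agent's view of the device and its actions stay auditable.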

Why local‑first?

  1. No cloud relay
  2. Screen content and inputs never leave your own network
  3. No unpredictable latency
  4. Fully reproducible automation
  5. Works even on offline devices
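Reproducibility falls out of this design almost for free: since every action is an explicit command, a session can be recorded and the log replayed later. A toy sketch of that record/replay idea (my own illustration, not project code):

```python
from typing import Callable

# Toy record/replay: each action is an explicit, serializable step,
# so a recorded session can be re-run deterministically.
# This illustrates the idea only; it is not AIVane's implementation.

def run(steps: list[dict], execute: Callable[[dict], None]) -> list[dict]:
    log = []
    for step in steps:
        execute(step)      # e.g. send the command over LAN to the device app
        log.append(step)   # keep an exact record of what ran
    return log

steps = [{"action": "tap", "x": 100, "y": 200},
         {"action": "type", "text": "hello"}]

executed: list[dict] = []
first_run = run(steps, executed.append)
replay = run(first_run, lambda s: None)  # replaying the log yields the same steps
assert first_run == replay == steps
```

With no cloud relay in the loop, the same log produces the same device interactions on every run, which is what makes failures debuggable.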

If you’re interested in AI agents, Android automation, or building local‑first tools, I’d love feedback.
Repo is here: https://github.com/aivanelabs/ai-rpa

Top comments (3)

JT THOMAS (JT)

I'm loving it!
I will collab with you or at least be a tester for PRs/commits...
I've done quite a lot of this with scrcpy and shell commands but I really like your approach here!
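For reference, the shell-command workflow mentioned here usually boils down to driving `adb shell input` (tap, text, swipe are standard adb commands; the coordinates below are made-up examples, and the wrapper function is my own sketch):

```python
import shlex
import shutil
import subprocess

# The scrcpy + shell approach typically drives the device with commands
# like these. Coordinates are example values only.
COMMANDS = [
    "adb shell input tap 540 960",                # tap at (540, 960)
    "adb shell input text hello",                 # type text
    "adb shell input swipe 540 1600 540 400 300", # swipe up over 300 ms
]

def run_cmd(cmd: str, dry_run: bool = True):
    """Run an adb command, or just print it when dry_run is set
    (or when adb isn't installed)."""
    argv = shlex.split(cmd)
    if dry_run or shutil.which("adb") is None:
        print(" ".join(argv))  # dry run: show what would execute
        return None
    return subprocess.run(argv, check=True)

for cmd in COMMANDS:
    run_cmd(cmd)
```

This works, but the agent is blind: it has no structured view of the UI, which is exactly the gap an on-device inspection API is meant to close.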

labs aivane

Thank you — I really appreciate it.

It’s especially valuable to hear this from someone who has already worked with scrcpy and shell-command-based workflows.

And yes, I’d definitely welcome you as a tester. If you try it and run into bugs, friction points, missing capabilities, or things the product still can’t do well, please feel free to share them here or open an issue/discussion on GitHub.

That kind of real feedback would help a lot right now.
