This guide documents a real-world setup process: turning an Android phone into a portable AI development environment using Termux, Ubuntu (proot), Node.js, Ollama, Claude Code, and OpenClaw.
The goal: run modern AI coding tools on a mobile device without root.
🧱 1. Base Setup: Termux + Ubuntu
We start by installing Termux and setting up a Linux environment.
Install Termux packages:

```bash
pkg update && pkg upgrade -y
pkg install proot-distro git curl wget -y
```

Install Ubuntu:

```bash
proot-distro install ubuntu
proot-distro login ubuntu
```
Now we have a full Linux environment running on Android.
⚙️ 2. Installing Node.js (Critical Step)
Many modern AI tools require Node.js 22+.
Initial issue encountered:

```
Node.js version mismatch (required 22.12+, installed 20.x)
```
Fix:
Install Node.js using NVM (recommended for Android):

```bash
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash
source ~/.bashrc
nvm install 22
nvm use 22
nvm alias default 22
```
Verify:

```bash
node -v
npm -v
```
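To avoid hitting the version mismatch again in scripts, a small guard can check the active major version before running anything. The `node_major` helper below is hypothetical (not part of nvm), just a sketch of the idea:

```bash
# node_major: hypothetical helper that extracts the major version
# from a "vX.Y.Z" string, as printed by `node -v`.
node_major() { printf '%s\n' "$1" | sed 's/^v//' | cut -d. -f1; }

node_major "v22.12.0"   # prints: 22

# Guard a script with it (requires node on PATH):
# [ "$(node_major "$(node -v)")" -ge 22 ] || { echo "Need Node 22+" >&2; exit 1; }
```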
🤖 3. Installing Ollama (Local AI Models)
Ollama allows running local LLMs directly on the device.
Install:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

Start the server:

```bash
ollama serve
```

Run a model:

```bash
ollama run qwen2.5-coder:3b
```
For mobile devices, lightweight models are recommended:
- qwen2.5-coder:3b
- phi4-mini
- gemma3:4b
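Once `ollama serve` is running, a model can also be queried over Ollama's HTTP API, which is what agent tools do under the hood. A minimal sketch, assuming the default port 11434 and the qwen2.5-coder:3b model pulled above:

```bash
# Build a non-streaming generate request for the local Ollama server.
payload='{"model":"qwen2.5-coder:3b","prompt":"Write a one-line Python hello world","stream":false}'

# Send it to the /api/generate endpoint; print a hint if the server is down.
curl -s http://127.0.0.1:11434/api/generate -d "$payload" \
  || echo "Ollama not reachable; start it with: ollama serve"
```

With `"stream": false` the server returns a single JSON object instead of a stream of partial responses, which is easier to handle in shell scripts.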
🧠 4. Installing Claude Code
Claude Code is an AI coding CLI tool.
Install:

```bash
npm install -g @anthropic-ai/claude-code
```
Issue encountered:
- "native binary not installed"
- caused by Android/proot incompatibility

Fix: run Claude Code inside the Ubuntu (proot) environment only, not in raw Termux.
⚠️ 5. Fixing npm Installation Errors
A major issue appeared:

```
ENOENT rename /root/.npm/_cacache/tmp
Invalid response body from registry
```

Fix:

```bash
rm -rf ~/.npm
npm cache clean --force
npm config set cache /tmp/npm-cache
mkdir -p /tmp/npm-cache
```
Then retry installation.
🔧 6. Installing OpenClaw (AI Agent)
OpenClaw is an AI agent system that can automate coding tasks.
Install:

```bash
npm install -g openclaw
```
Issue encountered:
- Node version requirement mismatch (Node 22.12 required)
Fix:
Upgrade Node.js to version 22 using NVM.
⚙️ 7. Configuring OpenClaw
Create the config file (the directory may not exist yet):

```bash
mkdir -p ~/.openclaw
nano ~/.openclaw/config.json
```
Example configuration using Ollama:
```json
{
  "agent": {
    "model": "ollama/qwen2.5-coder:3b"
  },
  "providers": {
    "ollama": {
      "base_url": "http://127.0.0.1:11434"
    }
  },
  "features": {
    "allow_shell": true,
    "allow_file_edit": true
  }
}
```
🚀 8. Running the Full Stack
Start Ollama:

```bash
ollama serve
```
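Before launching the agent, it is worth confirming the server actually answers. A quick check (a sketch, assuming the default port) using Ollama's `/api/tags` endpoint, which lists installed models:

```bash
# Probe the Ollama API on its default port before starting the agent.
ollama_url="http://127.0.0.1:11434"
if curl -fsS "$ollama_url/api/tags" >/dev/null 2>&1; then
  echo "ollama: up"
else
  echo "ollama: down (run 'ollama serve' first)"
fi
```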
Start OpenClaw:

```bash
openclaw chat
```
Now the system can:
- analyze code
- run shell commands
- use local AI models
- automate development tasks
🧩 9. Key Issues Encountered
- Node.js version mismatch: fixed with NVM (Node 22)
- npm cache corruption: fixed by clearing ~/.npm and using a /tmp cache
- Android proot filesystem issues: fixed by avoiding Termux-native installs and using the Ubuntu environment
- Ollama connectivity issues: fixed by using 127.0.0.1 instead of localhost
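The last point deserves a note: on some Android/proot images the name `localhost` resolves to the IPv6 address `::1` first, while Ollama by default binds only the IPv4 loopback, so clients using `localhost` get "connection refused". You can check what the name actually resolves to:

```bash
# Show the first address 'localhost' resolves to (via /etc/hosts or the resolver).
resolved=$(getent hosts localhost | awk '{print $1}' | head -n1)
echo "localhost resolves to: $resolved"
# If this prints ::1, point tools at 127.0.0.1 explicitly.
```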
🧠 Final Architecture
```
Android (Termux)
  ↓
Ubuntu (proot)
  ↓
Node.js 22 + NVM
  ↓
Ollama (local AI)
  ↓
OpenClaw (AI agent)
  ↓
Claude API (optional cloud intelligence)
```
🔥 Conclusion
Android can be transformed into a full AI development workstation using Termux and Ubuntu without root.
While some tools (like OpenClaw and Claude Code) require workarounds due to Linux/Android differences, a hybrid setup of local + cloud AI makes the system powerful and practical.
This setup is ideal for:
- AI coding on the go
- learning Linux
- building automation agents
- experimenting with LLMs locally
If you want to improve this setup further, the next step is integrating VS Code (code-server) and connecting remote GPU servers for heavy models.