Your old Android phone is not e-waste. It's a powerful ARM64 server waiting to happen.
OCA seamlessly installs the OpenClaw AI ecosystem directly onto your device via Termux. It bypasses sluggish proot-based Linux distributions (like Ubuntu on Proot) entirely, running natively with full glibc compatibility instead of Android's default Bionic libc.
| Before OCA ❌ | After OCA ✅ |
|---|---|
| Slow Proot containers | Native ARM64 execution |
| Bionic libc limitations | Full glibc compatibility |
| Manual setup (hours) | One-command install (2 mins) |
| Limited AI tools | 4+ AI CLIs pre-configured |
| No remote access | SSH server included |
| Static installation | Auto-updating ecosystem |
Configure Developer Options, Stay Awake, and battery optimization to prevent Android from killing Termux.
📖 Read the Phone Setup Guide for step-by-step instructions.
⚠️ Important: The Play Store version of Termux is discontinued. You must install it from F-Droid.
Open Termux and run:
```bash
pkg update -y && pkg install -y curl
curl -sL https://raw.githubusercontent.com/PsProsen-Dev/OpenClaw-On-Android/master/bootstrap.sh | bash && source ~/.bashrc
```
Takes 3-10 minutes depending on network and device.
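If you prefer to inspect the installer before executing it, download it first (same URL as above) instead of piping it straight into bash:

```bash
# Download, review, then run the installer
curl -sLo bootstrap.sh https://raw.githubusercontent.com/PsProsen-Dev/OpenClaw-On-Android/master/bootstrap.sh
less bootstrap.sh
bash bootstrap.sh && source ~/.bashrc
```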
```bash
openclaw onboard
```
Follow the on-screen instructions.
```bash
openclaw gateway
```
⚠️ Important: Run this directly in the Termux app, not over SSH. If the SSH session disconnects, the gateway stops.
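If you do need the gateway to survive a dropped connection, one workaround (not an official OCA feature) is to run it inside tmux, which `oa --install` offers as an optional tool:

```bash
# Start the gateway in a detached tmux session named "openclaw"
tmux new-session -d -s openclaw 'openclaw gateway'

# Later, re-attach from Termux or SSH to check on it
tmux attach -t openclaw
```

Note that tmux only protects against the session closing; it does not stop Android itself from killing the Termux process (see the Phantom Process Killer fix below).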
After installation, use the `oa` command to manage your installation:
| Command | Description |
|---|---|
| `oa --update` | Update OpenClaw and Android patches |
| `oa --install` | Install optional tools (tmux, code-server, AI CLIs) |
| `oa --uninstall` | Remove OpenClaw on Android |
| `oa --status` | Show installation status |
| `oa --version` | Show version |
| `oa --help` | Show available options |
Update example:
```bash
oa --update && source ~/.bashrc
```
This updates: OpenClaw core, code-server, OpenCode, AI CLI tools, and Android patches.
OCA now supports local LLM inference via node-llama-cpp and Ollama (including cloud models!).
Run powerful models in the cloud: zero local RAM/storage usage!
```bash
# Pull and launch with cloud model
ollama pull kimi-k2.5:cloud
ollama launch openclaw --model kimi-k2.5:cloud
```
Recommended Cloud Models:
- `kimi-k2.5:cloud` - Multimodal reasoning (64k context)
- `minimax-m2.5:cloud` - Fast coding (64k context)
- `glm-5:cloud` - Reasoning & code generation
- `gpt-oss:120b-cloud` - High-performance (128k context)

For local inference, install one of the two backends:

```bash
# Option 1: node-llama-cpp (Recommended)
npm install -g node-llama-cpp --ignore-scripts

# Option 2: Ollama (Full Server)
curl -fsSL https://ollama.com/install.sh | sh
```
| Model | Size | RAM Needed | Speed | Best For |
|---|---|---|---|---|
| TinyLlama 1.1B | ~670MB | 2GB | ⚡⚡⚡ | Testing |
| Phi-3 Mini | ~2.3GB | 4GB | ⚡⚡ | Light tasks |
| Llama 3.2 1B | ~670MB | 2GB | ⚡⚡⚡ | Mobile-friendly |
| Mistral 7B | ~4.1GB | 8GB | ⚡ | Advanced only |
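After installing a backend, you can smoke-test a small model from the table. A hypothetical Ollama session (the `llama3.2:1b` tag assumes that model is available in the Ollama library; the first pull downloads ~670MB):

```bash
# Download and query a small model to verify local inference works
ollama pull llama3.2:1b
ollama run llama3.2:1b "Reply with exactly one word: ready"
```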
📖 Read the Full Local LLM Guide for detailed setup, troubleshooting, and cloud comparison.
```
┌───────────────────────────────────────────────────────┐
│ Android Device (Termux)                               │
│            │                                          │
│            ▼                                          │
│   ┌──────────────────┐                                │
│   │   glibc-runner   │                                │
│   │ (ld.so wrapper)  │                                │
│   └────────┬─────────┘                                │
│            │                                          │
│            ▼                                          │
│   ┌──────────────────┐                                │
│   │   Node.js v24    │                                │
│   │   linux-arm64    │                                │
│   └────────┬─────────┘                                │
│            │                                          │
│    ┌───────┴──────────┐                               │
│    ▼                  ▼                               │
│ ┌──────────────┐  ┌──────────────┐                    │
│ │  OpenClaw    │  │  Local LLM   │                    │
│ │  Gateway     │  │  (Optional)  │                    │
│ │              │  │              │                    │
│ │ • AI CLIs    │  │ • llama.cpp  │                    │
│ │ • SSH        │  │ • Ollama     │                    │
│ │ • clawdhub   │  │ • GGUF       │                    │
│ └──────────────┘  └──────────────┘                    │
└───────────────────────────────────────────────────────┘
```
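The glibc-runner layer shown above essentially invokes the glibc dynamic linker directly instead of letting Android's Bionic linker load the binary. A minimal sketch of such a wrapper, with an illustrative prefix path and variable name that are not OCA's actual layout:

```bash
#!/data/data/com.termux/files/usr/bin/sh
# Hypothetical ld.so wrapper: run a glibc-linked binary through the
# glibc loader. GLIBC_PREFIX is illustrative, not an actual OCA path.
GLIBC_PREFIX="$HOME/glibc"
exec "$GLIBC_PREFIX/lib/ld-linux-aarch64.so.1" \
    --library-path "$GLIBC_PREFIX/lib" \
    "$@"
```

Invoked as, say, `glibc-runner node --version`, this would run the linux-arm64 Node.js build under glibc rather than Bionic.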
- glibc-runner: injects `ld-linux-aarch64.so.1` to bypass Android's restricted linker
- System paths (`/tmp`, `/bin/sh`) are dynamically mapped to Termux prefixes
- `glibc-compat.js` polyfills `os.cpus()` and `os.networkInterfaces()` for V8

| 📚 Guide | Description |
|---|---|
| 🚀 Quick Start | Get running in 5 minutes |
| 📱 Phone Setup | Developer Options, Stay Awake, Battery |
| 🔧 Installation | Full 8-step installer breakdown |
| 🤖 AI CLI Tools | Qwen, Claude, Gemini, Codex setup |
| 📦 Local LLM | Run models locally (node-llama-cpp, Ollama) |
| 🔌 Dashboard Connect | Multi-device management from PC |
| 🔐 SSH Setup | Remote access configuration |
| ⚙️ Configuration | Manage settings and preferences |
| 🔧 Troubleshooting | Common errors and fixes |
| 👻 Phantom Process Killer | Android 12+ fix |
📖 Browse all docs: `docs/`
If you see `[Process completed (signal 9)]`, Android's Phantom Process Killer has terminated Termux.
Fix it in 30 seconds:
```bash
adb shell settings put global development_settings_enabled 1
adb shell settings put global max_phantom_processes 64
```
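To confirm the new values took effect, read them back (this assumes the device is still connected over adb):

```bash
# Read back the values set above; expect "1" and "64"
adb shell settings get global development_settings_enabled
adb shell settings get global max_phantom_processes
```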
📖 Read the Full Fix Guide
Android may kill background processes or throttle them when the screen is off. For 24/7 operation:
| Setting | Purpose | How To |
|---|---|---|
| Developer Options | Enable advanced controls | Settings → About → Tap Build Number 7x |
| Stay Awake | Prevent CPU throttling | Developer Options → Stay Awake |
| Battery Optimization | Prevent app killing | Settings → Apps → Termux → Battery → Unrestricted |
| Charge Limit | Protect battery during 24/7 use | Use AccuBattery or similar |
📖 Read the Complete Phone Setup Guide for detailed instructions.
Access your OCA dashboard from a PC browser via an SSH tunnel:

```bash
# From your PC terminal
ssh -L 3000:localhost:3000 -L 8080:localhost:8080 u0_aXXX@192.168.X.X -p 8022
```
📖 Read the SSH Setup Guide for complete instructions.
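The tunnel above assumes an SSH server is already running on the phone. A minimal Termux-side setup looks roughly like this (Termux's sshd listens on port 8022 by default):

```bash
# On the phone, in Termux
pkg install -y openssh     # OpenSSH server and client
passwd                     # set a login password for SSH
sshd                       # start the server on port 8022
whoami                     # your u0_aXXX username for the ssh command
```

Find the phone's 192.168.x.x address in Android's Wi-Fi settings if you don't have a network tool like `ifconfig` installed.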
Running OCA on multiple devices? Use Dashboard Connect to manage them from your PC:
💡 Tip: Name your devices (e.g., "Old Pixel", "Bedroom Phone") for easy identification.
OCA uses a platform-plugin architecture that separates platform-agnostic infrastructure from platform-specific code:
```
┌─────────────────────────────────────────────────────────────┐
│ Orchestrators (install.sh, update-core.sh, uninstall.sh)    │
│   └─ Platform-agnostic. Read config.env and delegate.       │
├─────────────────────────────────────────────────────────────┤
│ Shared Scripts (scripts/)                                   │
│   ├─ L1: install-infra-deps.sh (always)                     │
│   ├─ L2: install-glibc.sh, install-nodejs.sh (conditional)  │
│   └─ L3: Optional tools (user-selected)                     │
├─────────────────────────────────────────────────────────────┤
│ Platform Plugins (platforms/<platform>/)                    │
│   ├─ config.env: declares dependencies                      │
│   └─ install.sh / update.sh / uninstall.sh / ...            │
└─────────────────────────────────────────────────────────────┘
```
Dependency layers:
| Layer | Scope | Examples | Controlled by |
|---|---|---|---|
| L1 | Infrastructure (always) | git, pkg update | Orchestrator |
| L2 | Platform runtime (conditional) | glibc, Node.js, build tools | config.env flags |
| L3 | Optional tools (user-selected) | tmux, code-server, AI CLIs | User prompts |
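To make the L2 gating concrete, here is a sketch of how an orchestrator might consume config.env flags. The flag names (`NEEDS_GLIBC`, etc.) are hypothetical, invented for illustration; check the real `platforms/<platform>/config.env` for the actual keys:

```bash
# Hypothetical platform config.env (flag names are illustrative)
cat > /tmp/config.env <<'EOF'
NEEDS_GLIBC=true
NEEDS_NODEJS=true
NEEDS_BUILD_TOOLS=false
EOF

# Orchestrator-style gating: source the config, then run L2
# installers only for the capabilities the platform declared.
. /tmp/config.env
[ "$NEEDS_GLIBC" = "true" ]       && echo "run install-glibc.sh"
[ "$NEEDS_NODEJS" = "true" ]      && echo "run install-nodejs.sh"
[ "$NEEDS_BUILD_TOOLS" = "true" ] && echo "run install-build-tools.sh" || true
```

With this config, only the glibc and Node.js installers would run; build tools would be skipped.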
Core Infrastructure (L1): git
Platform Runtime (L2): pacman, glibc-runner, Node.js v24, python, make, cmake, clang, binutils
OpenClaw Platform: OpenClaw, clawdhub, PyYAML, libvips
Optional Tools (L3): tmux, ttyd, dufs, android-tools, code-server, OpenCode, Claude Code, Gemini CLI, Codex CLI
| Feature | Description |
|---|---|
| node-llama-cpp | Prebuilt binary support with --ignore-scripts |
| Ollama Integration | Full server with model management |
| Model Guide | Recommendations for RAM/Storage constraints |
| Cloud vs Local | Comparison table for decision making |
📦 View Release Notes