H4Ck3R
How To Run Ollama In Android (Without Root)

Yes, you can run Ollama directly on your Android device without needing root access, thanks to the Termux environment and its package manager. This turns your smartphone into a portable powerhouse for running local large language models (LLMs). 🚀


Prerequisites

Before you start, you'll need two things:

  1. A modern Android device. Performance will heavily depend on your phone's RAM and processor. A device with at least 8 GB of RAM is recommended for a smoother experience.
  2. The Termux application. It's crucial to install it from F-Droid, as the version on the Google Play Store is outdated and no longer maintained.
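If you want to check what your device has to work with before installing anything, Termux's standard Linux tools can report memory and storage (on some setups `free` may first need `pkg install procps`):

```shell
# Check available RAM (may need: pkg install procps)
free -h
# Check free storage in Termux's home directory
df -h ~
```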

Installation and Setup Guide

Follow these simple steps to get Ollama up and running.

1. Install and Update Termux

First, download and install Termux from the F-Droid app store. Once installed, open the app and update its core packages to ensure everything is current.

```shell
pkg update && pkg upgrade
```

You will be prompted several times during the upgrade, mostly about replacing configuration files; it's generally safe to press Enter and accept the default shown at each prompt.
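If you'd rather skip the prompts entirely, `pkg` wraps `apt` and generally accepts the usual `-y` auto-confirm flag — a sketch, assuming your Termux build passes the flag through:

```shell
# Non-interactive variant: -y answers "yes" to every prompt
pkg update -y && pkg upgrade -y
```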

2. Install Ollama

With Termux up to date, installing Ollama is as simple as running a single command. The Ollama package is now available in the official Termux repositories.

```shell
pkg install ollama
```

This command will download and install the Ollama server and command-line interface (CLI) on your device.
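To confirm the install worked, you can check that the CLI is on your PATH:

```shell
# Prints the installed Ollama version if the install succeeded
ollama --version
```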

3. Run the Ollama Server

Ollama operates as a background server that manages the models. You need to start this server before you can run and chat with an LLM. It's best practice to run this in its own dedicated Termux session.

Open Termux and start the server:

```shell
ollama serve
```

You'll see some log output indicating the server is running. Keep this session open. You can run the termux-wake-lock command beforehand to stop Android from killing the process when the phone sleeps.
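By default the server listens on localhost port 11434, so from another session you can verify it is up with a quick request:

```shell
# The root endpoint answers with a plain-text health message
curl http://localhost:11434
# → Ollama is running
```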

4. Download and Run a Model

Now, open a new Termux session by swiping from the left edge of the screen and tapping "NEW SESSION". In this new terminal, you can interact with the Ollama CLI.

To download and start chatting with a model (for example, Mistral), use the run command:

```shell
ollama run mistral
```

The first time you run this command, it will download the specified model, which might take some time and storage space. Subsequent runs will be much faster. Once downloaded, you can start chatting with the model directly in your terminal!
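A few other CLI subcommands are handy once the server is running: `ollama pull` fetches a model without opening a chat, and `ollama run` also accepts a one-shot prompt as an argument:

```shell
ollama list                       # show downloaded models and their sizes
ollama pull tinyllama             # download a model without starting a chat
ollama run mistral "Hello there"  # one-shot prompt: prints the reply and exits
```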


Important Tips

  • Model Size Matters: Mobile devices have limited resources. For better performance, start with smaller models like Phi-3 Mini, TinyLlama, or Gemma:2b. Running larger models like Llama 3 8B might be slow or crash if your phone doesn't have enough RAM.
  • Storage Space: LLMs are large files, often several gigabytes. Ensure you have enough free storage on your device before downloading models.
  • Keep it Running: Android's battery optimization can be aggressive. Use the termux-wake-lock command in the server session to prevent Termux from being shut down.
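When storage gets tight, downloaded models can be removed with the CLI — for example, assuming you pulled Mistral earlier:

```shell
ollama rm mistral   # delete the model's weights to reclaim several GB
```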
