📱 Install & Setup Ollama Local AI Agent on Android (Termux)

Run powerful local AI models directly on your Android device using Termux and Ollama — completely offline and free.


🚀 Overview

This guide walks you through:

  • Installing Termux
  • Setting up required packages
  • Installing Ollama
  • Running a local AI model

📦 Prerequisites

  • Android device (recommended: 6GB+ RAM for better performance; a quick memory check is shown after this list)
  • Stable internet connection (for initial setup and model download)
  • Basic familiarity with terminal commands
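
To check the RAM recommendation above, you can read the kernel's memory stats from inside Termux (standard Linux procfs; values are reported in kB):

grep -E 'MemTotal|MemAvailable' /proc/meminfo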

1️⃣ Install Termux

Download Termux from F-Droid (recommended):

👉 https://f-droid.org/packages/com.termux/

⚠️ Avoid the Play Store version — it is outdated.


2️⃣ Setup Termux

Open Termux and run:

pkg update && pkg upgrade -y

Grant storage permission:

termux-setup-storage
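
If the permission was granted, Termux creates a ~/storage directory with links to shared storage; you can confirm with:

ls ~/storage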

3️⃣ Install Ollama

Install Ollama package:

pkg install ollama
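
To confirm the install succeeded, print the version:

ollama --version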

4️⃣ Start Ollama Server

Ollama's server must be running before you can use any model:

ollama serve

💡 Keep this session running.
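
If you would rather not keep a session occupied, one option is to start the server in the background and send its output to a log file (the log path here is just an example; the server listens on 127.0.0.1:11434 by default):

nohup ollama serve > "$HOME/ollama-server.log" 2>&1 &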


5️⃣ Run a Model

Open a new Termux session:

  • Swipe from the left
  • Tap "New Session"

No linking step is needed: the pkg build already installs the ollama binary on your PATH.

Run a model (example):

ollama run llama3.2:1b
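
You can also pass a one-shot prompt instead of entering the interactive chat:

ollama run llama3.2:1b "Explain in one sentence what Termux is."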

🧠 Recommended Models

Model         Size     Notes
llama3.2:1b   Small    Fast, low resource usage
llama3:8b     Medium   Better quality
mistral       Medium   Balanced performance
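
To fetch a model from the table ahead of time and see what is stored locally:

ollama pull llama3.2:1b
ollama list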

⚙️ Tips for Better Performance

  • Close background apps
  • Use smaller models if your device lags (a memory-saving server option is sketched after this list)
  • Keep device cool (AI workloads heat phones fast)
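
If RAM is tight, one option is Ollama's keep-alive setting, which controls how long a model stays loaded after each request; 0 unloads it immediately. Start the server this way in its dedicated session:

OLLAMA_KEEP_ALIVE=0 ollama serve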

🛠 Troubleshooting

❌ No internet connection

  • Ensure network is active before downloading models

❌ Command not found

  • Restart Termux or re-run the install steps (a quick PATH check is shown below)
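
To verify whether the binary is actually on your PATH before reinstalling:

command -v ollama || pkg install ollama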

❌ Slow performance

  • Use a smaller model like llama3.2:1b

📌 Notes

  • First run downloads the model (can be large; a cleanup command is shown after this list)
  • Works fully offline after download
  • Performance depends on your device hardware
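
Since models can take gigabytes of storage, you can reclaim space by removing ones you no longer use:

ollama list          # show installed models and their sizes
ollama rm llama3:8b  # remove a model by name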

👤 Author

n4od
GitHub: https://github.com/n4od


🤝 Contributing

Feel free to fork, improve, and submit pull requests.


📄 License

MIT License
