AI designed for you, only for you.
Run powerful language models locally. Keep your conversations private. Switch between models without reinstalling.
Key Features
On-Device AI
Run advanced AI models directly on your iPhone, even offline. No servers or internet connection required.
Privacy by Design
Your conversations never leave your device. Arbiter collects no data, requires no account, and does no cloud logging.
Flexible Model Management
Download lightweight open-source models like Gemma, LLaMA, or TinyLlama. Switch between them to balance speed, accuracy, and memory use.
Customizable Experience
Choose your preferred model, manage downloads, and adjust settings to match your workflow and device.
Persistent Conversations
Chats are securely stored on-device under your control, so you can revisit them anytime without losing context.
No Hidden Costs
By running AI locally, Arbiter avoids expensive cloud fees, giving you sustainable performance without subscriptions.
Our Mission
Arbiter is redefining the future of AI chat. We believe in empowering individuals and organizations with tools that offer choice, control, and clarity. By running AI locally and giving users ownership of their data, Arbiter puts privacy and power back into your hands.
FAQ
Is my data private?
Yes. Everything is processed and stored on your device. Your chats and files never leave your phone unless you explicitly choose to share them. We do not store, log, or sell your data. Your information stays 100% under your control.
Do I need an internet connection to use the app?
No. Once installed, the app works entirely offline. You can use it on airplanes, in remote areas, or without Wi-Fi, while still getting fast and responsive answers.
How does offline AI work?
The app uses optimized large language models (LLMs) that run locally on your device's processor. Once downloaded, these models need no server access, so your requests are processed on-device and never sent to the cloud.
What devices are supported?
Currently, the app supports modern iPhones and iPads running iOS 16 or later, with best performance on devices equipped with Apple Silicon chips (A14 Bionic or newer). Most models will require a device with more than 4 GB of RAM.
What AI models are available?
You can choose from a range of open-source LLMs. The app supports switching models at any time so you can balance speed, accuracy, and device performance.