Camel Chat
Camel Chat is a feature-rich Flutter application that provides a clean interface for chatting with large language models (LLMs) served by an Ollama server, so you can run open-source AI models on your own hardware.
Features
- Connect to Ollama Servers: Easily connect to any Ollama server with optional basic HTTP authentication.
- Multiple Model Support: Chat with any model available on your Ollama server.
- Complete Chat History: View and manage your conversation history.
- Dark Mode Support: Switch between light and dark themes for comfortable viewing.
- Custom System Prompts: Define system prompts to set the AI's behaviour and context (see the example request after this list).
- Export Conversations: Export your chats as markdown files for sharing or archiving.
- Chat Organisation: Auto-generated meaningful titles for your conversations.
- Responsive UI: Works seamlessly on both mobile and desktop devices.
- Code Formatting: Proper rendering and formatting of code blocks in responses.
- Local Storage: All your conversations are stored locally for privacy.
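Under the hood, features such as custom system prompts map onto the Ollama chat API. The exact requests Camel Chat sends are an implementation detail, but a minimal sketch of the kind of request involved, assuming a local server and the gemma3 model, looks like this:

```bash
# Illustrative only: a chat request that sets a custom system prompt,
# assuming an Ollama server on localhost and the gemma3 model.
curl http://localhost:11434/api/chat -d '{
  "model": "gemma3",
  "messages": [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Explain what Ollama is in one sentence."}
  ],
  "stream": false
}'
```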
Getting Started
Prerequisites
- A running Ollama server (local or remote).
Installation
Android
Download and install the APK from the releases page.
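If you prefer to sideload from a computer, the APK can also be installed over `adb`; the file name below is a placeholder for whichever release you downloaded:

```bash
# Hypothetical file name; use the APK downloaded from the releases page.
adb install camel-chat.apk
```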
Linux
Choose one of the following packages from the releases page (example install commands are shown after this list):
- Debian/Ubuntu: Download and install the `.deb` package.
- Fedora/RHEL: Download and install the `.rpm` package.
- Arch: Download and install the `.zst` package.
- Other distributions: Download the AppImage, make it executable, and run it.
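The exact file names depend on the release you download, so the names below are placeholders; the commands themselves are the standard package-manager invocations for each format:

```bash
# File names are placeholders; use the actual files from the releases page.
sudo apt install ./camel-chat.deb                        # Debian/Ubuntu
sudo dnf install ./camel-chat.rpm                        # Fedora/RHEL
sudo pacman -U camel-chat.pkg.tar.zst                    # Arch
chmod +x Camel-Chat.AppImage && ./Camel-Chat.AppImage    # AppImage (other distributions)
```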
Setting Up Your Ollama Server
- Install Ollama from https://ollama.com/.
- Pull the models you want to use (e.g., `ollama pull gemma3`).
- Run the Ollama server.
- Connect Camel Chat to your server by entering its URL (e.g., `http://localhost:11434/`).
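As a quick sanity check before connecting from the app, you can pull a model, start the server, and confirm the API responds. The `ollama` commands and the `/api/tags` endpoint are standard Ollama; the basic-auth line assumes (hypothetically) that your server sits behind a reverse proxy that requires credentials, which is what Camel Chat's optional HTTP basic authentication is for.

```bash
# Pull a model and start the server (skip `ollama serve` if Ollama already
# runs as a system service).
ollama pull gemma3
ollama serve

# Verify the API is reachable; this lists the models available on the server.
curl http://localhost:11434/api/tags

# If the server is behind a reverse proxy enforcing HTTP basic auth
# (hypothetical host and credentials):
curl -u myuser:mypassword https://ollama.example.com/api/tags
```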
Roadmap
Here are some features and improvements planned for future releases:
- Stream Responses: Implement streaming responses for more interactive conversations (see the sketch after this list).
- File Attachments: Upload and process files during conversations.
- Chat Statistics: View usage statistics and performance metrics.
- Release on Flathub
- Windows & macOS Support
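For reference, streaming is already supported by the Ollama chat API itself: setting `"stream": true` makes the server return the reply as a sequence of newline-delimited JSON chunks instead of a single response. A minimal sketch, assuming a local server and the gemma3 model:

```bash
# Illustrative only: with "stream": true the Ollama API returns the answer
# incrementally as newline-delimited JSON objects.
curl http://localhost:11434/api/chat -d '{
  "model": "gemma3",
  "messages": [{"role": "user", "content": "Write a haiku about camels."}],
  "stream": true
}'
```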
I use Nobara on my laptop, which has an RTX 3060 (6 GB) and a Ryzen 7 5800H. Nvidia support is rough on every Linux distro, but you still get plenty of quality-of-life improvements by using Linux instead of Windows.
By the way, Nobara is just Fedora with some good gaming-related changes.