LocalAI is pretty good but resource-intensive. I ran it on a VPS in the past.
What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution?
It's fully open source and free (as in beer).
What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution?
You should try https://cherry-ai.com/. It's the most advanced client out there. I personally use Ollama for running the models and the Mistral API for advanced tasks.
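If you haven't used Ollama before: it exposes a small HTTP API on localhost that desktop clients (Cherry Studio included) talk to. Here's a minimal sketch, assuming Ollama is running on its default port 11434 and you've already pulled a model; the model name "llama3" is just an example:

```python
# Minimal sketch: query a local Ollama server over its REST API.
# Assumes Ollama is installed and a model has been pulled, e.g. `ollama pull llama3`.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",   # any model name you have pulled locally
        "prompt": "Explain what a self-hosted LLM client is in one sentence.",
        "stream": False,     # return a single JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

Everything stays on your own machine, so it fits the offline/self-hosted requirement; the Mistral API is only needed if you want to offload bigger tasks to a hosted model.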
Many are switching to Nextcloud Talk.
Thank you
Is it possible to follow accounts on PieFed?
Drill, baby, drill!
And then drop him into one of the holes.
Never heard of plebbit before. Looks interesting!
All I do is look at the open issues, the community, the docs, etc. I don't remember auditing the code.
I was thinking about X employees accessing the chat...
Time to pin Odysee and PeerTube in my browser, even though I pay for YT Premium (only $2 per month here in BD).