otto

joined 2 years ago
 

It has been a long time coming (Oracle bought Sun and MySQL over 15 years ago), but it seems WordPress is finally at the point where MariaDB's popularity has surpassed MySQL's, as shown by the stats at https://wordpress.org/about/stats/.

The share of MySQL 8.4 users is oddly low, just 0.1%. One would think it would still be at least 1% or so.

[–] otto@programming.dev 1 points 2 months ago (1 children)

By UV 3000 you probably don't mean the ultraviolet lamp that fills the first page of Google results for this term? I doubt UV, whatever it is, is a common approach.

 

What are your strategies when a MySQL/MariaDB database server grows to have more traffic than a single host can handle, i.e. when scaling CPU/RAM is no longer an option? Do you deploy ProxySQL and start splitting the traffic across two hosts according to some rule? What would the rule be, and how would you split the data? Has anyone migrated to TiDB? If so, how did you verify that the SQL your app uses is fully compatible with TiDB?
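To make the question concrete, the kind of first step I have in mind is a ProxySQL read/write split, roughly like the sketch below (the hostnames and hostgroup numbers are placeholders, and this of course only relieves read-heavy traffic):

```sql
-- Sketch of a ProxySQL read/write split, run against the ProxySQL admin
-- interface (port 6032 by default). Hostnames and hostgroup IDs are placeholders.

-- Hostgroup 10 = writer (primary), hostgroup 20 = readers (replicas).
INSERT INTO mysql_servers (hostgroup_id, hostname, port)
VALUES (10, 'db-primary.example.com', 3306),
       (20, 'db-replica.example.com', 3306);

-- Send SELECT ... FOR UPDATE to the writer, all other SELECTs to the readers;
-- everything else falls through to the user's default (writer) hostgroup.
INSERT INTO mysql_query_rules (rule_id, active, match_digest, destination_hostgroup, apply)
VALUES (1, 1, '^SELECT.*FOR UPDATE', 10, 1),
       (2, 1, '^SELECT',            20, 1);

LOAD MYSQL SERVERS TO RUNTIME; SAVE MYSQL SERVERS TO DISK;
LOAD MYSQL QUERY RULES TO RUNTIME; SAVE MYSQL QUERY RULES TO DISK;
```

That still leaves all writes on a single primary, which is why I am curious about sharding rules and about TiDB.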

 

Besides having the latest version available, what do Debian users who run MariaDB wish to see in future versions of MariaDB, or in how it is integrated and packaged in Debian?

I am the MariaDB maintainer in Debian, looking for feedback and ideas.

[–] otto@programming.dev 1 points 2 months ago

MariaDB supports Galera clustering out of the box, as well as traditional primary/replica setups. But you need something that spans multiple hosts to monitor and manage the cluster, and that is outside what a single-host OS package management system can do.
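For the traditional primary/replica case, the per-host setup is only a few SQL statements on the replica, roughly like the sketch below (hostname, credentials and GTID position are placeholders, normally taken from a backup of the primary). The part that cannot live inside a single-host package is the cross-host monitoring, failover and topology management around it.

```sql
-- Minimal sketch of pointing a freshly provisioned replica at a primary.
-- Hostname, user and password are placeholders; the data and GTID position
-- would normally come from a backup made with mariadb-backup or a dump.
CHANGE MASTER TO
  MASTER_HOST = 'db-primary.example.com',
  MASTER_USER = 'repl',
  MASTER_PASSWORD = 'secret',
  MASTER_USE_GTID = slave_pos;

START REPLICA;
SHOW REPLICA STATUS\G
```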

[–] otto@programming.dev 1 points 2 months ago (1 children)

You mean ollama? There are so many options, any favorites?

 

Besides having the latest version available, what do Ubuntu users who run MariaDB wish to see in future versions of MariaDB, or in how it is integrated and packaged in Ubuntu?

I am the MariaDB maintainer in Ubuntu, looking for feedback and ideas.

 

I’ve been exploring MariaDB 11.8’s new vector search capabilities for building AI-driven applications, particularly with local LLMs for retrieval-augmented generation (RAG) of fully private data that never leaves the computer. I’m curious about how others in the community are leveraging these features in their projects.

I’m especially interested in using it with local LLMs (like Llama or Mistral) to keep data on-premise and avoid cloud-based API costs or security concerns.

Does anyone have experiences to share? In particular, which LLMs are you using to generate the embeddings you store in MariaDB?
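To give an idea of the shape of it, my current experiment looks roughly like the sketch below. It is a toy example: the 3-dimensional vectors and literal values are only there to keep it short, while a real embedding model served locally (for example through Ollama) produces vectors with hundreds or thousands of dimensions.

```sql
-- Toy sketch of using the MariaDB 11.8 vector type for RAG storage/retrieval.
-- VECTOR(3) and the literal values are placeholders for real embeddings.
CREATE TABLE doc_chunks (
  id        BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
  content   TEXT NOT NULL,
  embedding VECTOR(3) NOT NULL,
  VECTOR INDEX (embedding)
);

-- Store a text chunk together with its embedding (JSON array text -> vector).
INSERT INTO doc_chunks (content, embedding)
VALUES ('MariaDB 11.8 adds a VECTOR data type and vector indexes.',
        VEC_FromText('[0.12, -0.03, 0.91]'));

-- Fetch the chunks nearest to the query embedding to build the RAG prompt.
SELECT content
FROM doc_chunks
ORDER BY VEC_DISTANCE_EUCLIDEAN(embedding, VEC_FromText('[0.10, -0.01, 0.88]'))
LIMIT 5;
```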

 

The XZ Utils backdoor, discovered last week, and the Heartbleed security vulnerability ten years ago, share the same ultimate root cause. Both of them, and in fact all critical infrastructure open source projects, should be fixed with the same solution: ensure baseline funding for proper open source maintenance.

 

As aliases

```bash
# Compact one-line graph log with short hash, subject, relative date and refs
alias g-log="git log --graph --format='format:%C(yellow)%h%C(reset) %s %C(magenta)%cr%C(reset)%C(auto)%d%C(reset)'"
# Browse the full history of all branches in gitk
alias g-history='gitk --all &'
# Fuzzy-pick a recent unmerged branch to check out (requires fzf)
alias g-checkout='git checkout $(git branch --sort=-committerdate --no-merged | fzf)'
# Commit or amend using the graphical commit tool
alias g-commit='git citool &'
alias g-amend='git citool --amend &'
# Rebase interactively, pull with rebase, and force-push safely
alias g-rebase='git rebase --interactive --autosquash'
alias g-pull='git pull --verbose --rebase'
alias g-pushf='git push --verbose --force-with-lease'
# Status including ignored files
alias g-status='git status --ignored'
# Wipe all local changes and untracked files, including in submodules
alias g-clean='git clean -fdx && git reset --hard && git submodule foreach --recursive git clean -fdx && git submodule foreach --recursive git reset --hard'
```
 

Pulsar (formerly Atom) is still the best code editor in my opinion. It is the easiest and fastest to use, has all the nice productivity-boosting plugins, and is overall great for all the same reasons Atom was great. 🚀

See also !pulsaredit@lemmy.ml

 

Habits can be sustained for years and years. Goals often compel acts of heroism, which are not sustainable in the long run. As Bruce Lee once said, “long-term consistency trumps short-term intensity.”