Is this Zuck as a Na'vi?
I don't know much about cybersecurity, but from what I understand about how LLMs work, there was always going to be a limit to what they can actually do. They have no understanding; they're just giant probability engines, so the 'hallucinations' aren't a solvable bug, they're inherent in the design of the models. And it's only going to get worse, as clean training data becomes harder and harder to find without being poisoned by existing LLM output.
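To make the 'probability engine' point concrete, here's a toy sketch in Python. The bigram table and its numbers are made up for illustration; a real LLM uses a neural net over billions of parameters and a huge vocabulary, but the generation loop follows the same principle: sample the next token by probability, with no truth-checking anywhere in the process.

```python
import random

# Hypothetical next-token probabilities (invented for this example;
# a real model computes these with a neural network).
bigram_probs = {
    "the": {"cat": 0.5, "dog": 0.3, "moon": 0.2},
    "cat": {"sat": 0.6, "ran": 0.4},
    "dog": {"ran": 0.7, "sat": 0.3},
    "moon": {"landing": 1.0},
    "sat": {"down": 1.0},
    "ran": {"away": 1.0},
}

def generate(start: str, max_tokens: int = 5) -> str:
    """Sample one likely continuation; nothing here checks whether it's true."""
    tokens = [start]
    for _ in range(max_tokens):
        choices = bigram_probs.get(tokens[-1])
        if not choices:
            break
        # The next token is picked by probability alone, so a fluent
        # but false continuation is just as valid a sample as a true one.
        next_token = random.choices(
            list(choices), weights=list(choices.values())
        )[0]
        tokens.append(next_token)
    return " ".join(tokens)

print(generate("the"))  # e.g. "the moon landing" -- plausible, never verified
```

The point is that 'hallucination' isn't a malfunction of this loop; it's the loop working exactly as designed.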
No, but it is an extremely common replacement for regular sugar, so common as to be near unavoidable.
Does it work on Steam Deck / Linux?
It's crazy, I've seen you comment something similar on multiple posts in this community, and yet you've only posted once here, and that was over a year ago. Quite the entitlement.
Which is crazy because, like... you'd think they would have heard of Linux before.