All models are wrong, but some are useful
Science Memes
Welcome to c/science_memes @ Mander.xyz!
A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.

Rules
- Don't throw mud. Behave like an intellectual and remember the human.
- Keep it rooted (on topic).
- No spam.
- Infographics welcome, get schooled.
This is a science community. We use the Dawkins definition of meme.
Model fetishism triggers me.
Quantum Mechanics has entered the chat
Isn't that mostly probability as well?
Surprisingly, that is a controversial view. Most physicists insist QM has nothing to do with probability! But then why does it only give you probabilistic predictions? Ye olde measurement problem: an entirely fabricated problem, because physicists cannot accept that a theory that gives you probabilities is obviously a probabilistic theory.
The wavestate is entirely deterministic, and we don't fully understand where the probabilistic measurement happens. The Copenhagen interpretation makes it probabilistic, but that is not proven.
(even many worlds doesn't explain why we ourselves only see one macroscopic section of the wavefunction)
In any statistical theory, the statistical distribution, which is typically represented by a vector that is a superposition of basis states, evolves deterministically. That is just a feature of statistics generally. But no one in their right mind would interpret the deterministic evolution of the statistical state as a physical object deterministically evolving in the real world. Yet, when it comes to QM, people insist we must change how we interpret statistics, and nobody can give a good argument as to why.
We only "don't fully understand where the probabilistic measurement happens" if you deny it is probabilistic to begin with. If you just start with the assumption that it is a statistical theory then there is no issue. You just interpret it like you interpret any old statistical theory. There is no invisible "probability waves." The quantum state is an epistemic state, based on the observer's knowledge, their "best guess," of a system that is in a definite state in the real world, but they cannot know it because it evolves randomly. Their measurement of that state just reveals what was already there. No "collapse" happens.
The paradox where we "don't know" what happens at measurement only arises if you deny this. If you insist that the probability distribution is somehow a physical object. If you do so, then, yes, we "don't know" how this infinite-dimensional physical object which doesn't even exist anywhere in physical space can possibly translate itself to the definite values that we observe when we look. Neither Copenhagen nor Many Worlds have a coherent and logically consistent answer to the question.
But there is no good reason to believe the claim to begin with that the statistical distribution is a physical feature of the world. The fact that the statistical distribution evolves deterministically is, again, a feature of statistics generally. This is also true of classical statistical models. The probability vector for a classical probabilistic computer is mathematically described as evolving deterministically throughout an algorithm, but no sane person takes that to mean that the bits in the computer's memory don't exist when you aren't looking at them, or that an infinite-dimensional object that doesn't exist anywhere in physical space is somehow evolving through the computer.
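To make the classical case concrete, here is a minimal sketch (my own illustration, not anything from the thread): a probability vector for a single bit evolving deterministically under a stochastic matrix, even though the bit itself is always in exactly one definite state.

```python
import numpy as np

# Probability vector over the two states of one classical bit: [P(0), P(1)].
p = np.array([1.0, 0.0])  # the bit definitely starts at 0

# A "noisy NOT" gate: flips the bit with probability 0.9, leaves it with 0.1.
# Each column sums to 1 (column-stochastic).
noisy_not = np.array([[0.1, 0.9],
                      [0.9, 0.1]])

# The distribution evolves deterministically: same input, same output, every time.
p = noisy_not @ p
print(p)  # [0.1 0.9]

# The bit itself still takes one definite value; the vector only tracks
# our knowledge of it.
sample = np.random.default_rng(0).choice([0, 1], p=p)
```

The deterministic object here is the *distribution*, not the bit, which is exactly the distinction the comment is drawing.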
Indeed, the quantum state is entirely decomposable into a probability distribution. Complex numbers aren't magic, they always just represent something with two degrees of freedom, so we can always decompose it into two real-valued terms and ask what those two degrees of freedom represent. If you decompose the quantum state into polar form, you find that one of the degrees of freedom is just a probability vector, the same you'd see in classical statistics. The other is a phase vector.
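The polar decomposition described above is easy to see in a toy example (mine, not taken from any particular paper): any complex amplitude vector splits into a real probability vector plus a phase vector.

```python
import numpy as np

# A qubit state: equal superposition with a relative phase of pi/2.
psi = np.array([1.0, 1.0j]) / np.sqrt(2)

# Polar form: psi_k = r_k * exp(i * theta_k).
probabilities = np.abs(psi) ** 2  # a plain probability vector, sums to 1
phases = np.angle(psi)            # the second degree of freedom

print(probabilities)  # [0.5 0.5]
print(phases)         # [0, pi/2]
```

One degree of freedom is an ordinary probability vector; the other is the phase vector the comment goes on to discuss.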
The phase vector seems mysterious until you write down time evolution rules for the probability vector in quantum systems as well as the phase vector. The rules, of course, take into account the previous values and the definition of the operator that is being applied to them. You then just have to recursively substitute in the phase vector's evolution rule into the probability vector's. You then find that the phase vector disappears, because it decomposes into a function over the system's history, i.e. a function over all operators and probability vectors at all previous time intervals going back to a division event. The phase therefore is just a sufficient statistic over the system's history and is not a physical object, as it can be defined in terms of the system's statistical history.
That is to say, without modifying it in any way, quantum mechanics is mathematically equivalent to a statistical theory with history dependence. The Harvard physicist Jacob Barandes also wrote a proof of this fact that you can read here. The history dependence does make it behave in ways that are a bit counterintuitive, as it inherently implies a non-spatiotemporal aspect to how the statistics evolve, as well as interference effects due to interference in its history, but they are still just statistics all the same. You don't need anything but the definition of the operators and the probability distributions to compute the evolution of a quantum circuit. A quantum state is not even necessary, it is just convenient.
If you just accept that it is statistics and move on, there is no "measurement problem." There would be no claim that the particles do not have definite states in the real world, only that we cannot know them because our model is not a deterministic model but a statistical model. If we go measure a particle's position and find it to be at a particular location, the explanation for why we find it at that location is just because that's where it was before we went to measure it. There is only a "measurement problem" if you claim the particle was not there before you looked, then you have difficulty explaining how it got there when you looked.
But no one has presented a compelling argument in the scientific literature that we should deny that it is there before we look. We cannot know what its value is before we look as its dynamics are (as far as we know) random, but that is a very different claim than saying it really isn't there until we look. This idea that the particles aren't there until we look has, in my view, been largely ruled out in the academic literature, and should be treated as an outdated view like believing in the Rutherford model of the atom. Yet, people still insist on clinging to it.
They pretend like Copenhagen and Many Worlds are logically consistent by writing an enormous sea of papers upon papers upon papers, where it only seems "consistent" because it becomes so complicated that hardly anyone even bothers to follow along with it anymore; but if you actually go through the arguments with a fine-tooth comb, you can always show them to be inconsistent and circular. There is only a vague aura of logical and mathematical consistency on the surface. The more you actually engage with both the mathematics and the academic literature on quantum foundations, the clearer it becomes how incoherent and contrived the attempts to make Copenhagen and Many Worlds consistent actually are, and how no one in the literature has actually achieved it, even though many falsely pretend they have done so.
I'm pretty sure this goes against the properties proven of entanglement (Bell test) and how far entanglement can propagate, but I don't know enough about quantum mechanics to explain why this explanation is incompatible with entanglement.
However, I don't currently see how this at all explains computing with superpositions; if it's just statistics a superposition can never exist, so entanglement doesn't exist; so quantum algorithms wouldn't be possible, but we know they are.
I’m pretty sure this goes against the properties proven of entanglement (Bell test) and how far entanglement can propagate, but I don’t know enough about quantum mechanics to explain why this explanation is incompatible with entanglement.
If you don't know anything about the topic then maybe you shouldn't speak on it. Especially when claiming you have debunked peer reviewed papers from Harvard physicists like Jacob Barandes.
However, I don’t currently see how this at all explains computing with superpositions; if it’s just statistics a superposition can never exist
Superposition is a property of statistics. Even classical statistics commonly represents a system's statistical state as a linear combination of basis states. That's just what a probability distribution is. If you take any courses in statistics, you will superimpose things all the time. It is a mathematical property.
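For example (a trivial sketch of my own): a biased coin's statistical state is already a linear combination, i.e. a superposition, of one-hot basis states.

```python
import numpy as np

# One-hot "basis states" for a classical coin.
heads = np.array([1.0, 0.0])
tails = np.array([0.0, 1.0])

# The coin's statistical state is literally a weighted sum of basis states.
state = 0.7 * heads + 0.3 * tails
print(state)  # [0.7 0.3]
```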
so entanglement doesn’t exist; so quantum algorithms wouldn’t be possible, but we know they are.
Quantum advantage obviously comes from the phase of the quantum state. If you remove the phase from the quantum state then all you are left with is a probability distribution, and so there would be nothing to distinguish it from a classical statistical theory. But the phase is, again, a sufficient statistic over the system's history. The quantum advantage comes from the fact that you are ultimately operating with a much larger information space, since each instruction in the computer is a function over the whole algorithm's history back to the start of the quantum circuit, rather than just the current state of the computer's memory at that present moment.
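The standard toy illustration of what the phase buys you (a sketch, not taken from the cited paper): apply a "fair coin" stochastic update twice versus a Hadamard gate twice. The classical distribution stays 50/50, because the first step's history is washed out; the quantum phases carry that history and interfere, returning the qubit to its initial state with certainty.

```python
import numpy as np

# Classical "fair coin flip" update: a doubly stochastic matrix.
coin = np.array([[0.5, 0.5],
                 [0.5, 0.5]])

# Quantum analogue: the Hadamard gate, acting on complex amplitudes.
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

p0 = np.array([1.0, 0.0])    # classical bit starts at 0
psi0 = np.array([1.0, 0.0])  # qubit starts at |0>

# Two coin flips: still maximally mixed.
print(coin @ coin @ p0)      # [0.5 0.5]

# Two Hadamards: the phases from the first step interfere destructively
# on the |1> branch, and the qubit is back at |0> with probability 1.
psi = H @ H @ psi0
print(np.abs(psi) ** 2)      # [1. 0.]
```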
I kinda boil it down to discrete energy packets distributed in an area as field values, and the collapse occurs when two discrete packets interact.
What if two packets interact with each other? If you claim a collapse occurs, then entanglement could never happen, and so such a viewpoint is logically ruled out. If you say a collapse does not occur between packets but only when you introduce a measurement device, then this is vague without rigorously defining what a measurement device is; and providing any additional physical definition will then introduce something into the dynamics which is not there in orthodox quantum mechanics, so you've now moved into a new theory and are no longer talking about textbook QM.
Logic isn't MAEth.
Butt MAEth is logic. Don't misunderstand my meaning.
This language is literally cognitive shit.
Economics: Our findings are just as rigorous as these other sciences we swear!

Well spoken, no less.
I once called economics a pseudoscience in a reddit comment and some libertarian-capitalist type got suuuper butthurt about it.
He said I don't understand the word pseudoscience. I said, "no I understand it just fine. You don't understand economics."
His only response was to call that a "no, you" argument. Dunning-Kruger on full display.
Oh, I see I'm not the only one who views it that way. It's always nice to see some people who have professional credibility expressing a similar opinion.
Also, I didn't know the "Nobel Prize in Economics" wasn't really a Nobel Prize at all (it's not awarded by the Nobel Foundation! They basically just appropriated the name...)
I always thought it was strange that there was one at all (or seemed to be one), and I didn't particularly like the credibility it seemed to lend to a field that doesn't deserve it, but it makes so much more sense now to know it's just a psyop run by a bank.
He was just a delusional living knot bot.
It's amazing how nonsensical the actual foundational axioms of modern day economics are.
Classical economics tried to tie economics to functions of physical things we can measure. Adam Smith, for example, proposed that because you can recursively decompose every product into the physical units of time it takes to produce, all the way down the supply chain, any stable economy should, on average (not in the individual case), roughly buy and sell in a way that reflects that time; otherwise there would necessarily have to be physical time shortages or waste, which would lead to economic problems. We thus may be able to use this time parameter to make quantifiable predictions about the economy.
Many people had philosophical objections to this because it violates free will. If you can predict roughly what society will do based on physical factors, then you are implying that people's decisions are determined by physical parameters. Humans have the "free will" to just choose to buy and sell at whatever price they want, and so the economy cannot be reduced beyond the decisions of the human spirit. There was thus a second school of economics which tried to argue that maybe you could derive prices from measuring how much people subjectively desire things, measured in "utils."
"Utils" are of course such ambiguous nonsense that eventually these economists realized that this cannot work, so they proposed a different idea instead, which is to focus on marginal rates of substitution. Rather than saying there is some quantifiable parameter of "utils," you say that every person would be willing to trade some quantity of object X for some quantity of object Y, and then you try to define the whole economy in terms of these substitutions.
However, there are two obvious problems with this.
The first problem is that to know how people would be willing to substitute things rigorously, you would need an incredibly deep and complex understanding of human psychology, which the founders of neoclassical economics did not have. Without a rigorous definition, you could not fit it to mathematical equations. It would just be vague philosophy.
How did they solve this? They... made it up. I am not kidding you. Look up the axioms for consumer preference theory whenever you have the chance. It is a bunch of made up axioms about human psychology, many of which are quite obviously not even correct (such as, you have to assume that the person has evaluated and rated every product in the entire economy, you have to assume that every person would be more satisfied with having more of any given object, etc), but you have to adopt those axioms in order to derive any of the mathematics at all.
The second problem is one first pointed out, to my knowledge, by the economist Nikolai Bukharin: an economic model based around human psychology cannot possibly even be predictive, because there is no logical reason to believe that the behavior of everything in the economy, including all social structures, is purely derivative of human psychology. That is, there is no reason to believe you cannot have a back-reaction whereby preexisting social structures and environmental factors people are born into shape their psychology, and he gives a good proof-by-contradiction that the back-reaction must exist.
The idea that you can derive everything based upon some arbitrary set of immutable mathematical laws made up in someone's armchair one day that supposedly rigorously details human behavior that is irreducible beyond anything else is just nonsense. No one has ever even tested any of these laws that supposedly govern human psychology.
it's also interesting how increasingly absurd economics gets the further it dissociates from reality.
people are dying
BUT THE DOW
Adam Smith was actually far more progressive than neoliberal/capitalist propaganda likes to portray him. They basically cherry-pick his work and present it out of context to support arguments that are actually contrary to many of the points he was making...
When I said "classical economic theory" I meant more like "conventional economic theory," so encompassing the absurdities you mentioned here.
Like, they'll say "Economies naturally cycle through periods of growth and degrowth" to justify periods of inflation, but then when those periods of inflation are artificially extended to further enrich the shareholders (and artificially inflated, even!), they'll conveniently ignore the whole "periods of degrowth" side of the coin, and if anything even remotely has a chance of causing deflation, it's denounced as an anathema because "it would cause a recession!"
Corporations benefit from economies that harm consumers. Corporations should never be given control over economic policies. However, neoliberal economic policies are basically designed to help the corporations while hurting consumers. And it's all founded upon conventional economic theories.
That's how you end up with a Federal Reserve that says things like "Unemployment is a good thing, because if everyone has too much money to spend on things, it could cause inflation," yet never addresses the standard business practice of increasing prices while cutting costs all to make "number go up" so that the shareholder value increases each quarter and the C-suite can get bigger bonuses...
They say things like "We have to raise prices to keep up with inflation," but no, that's literally just contributing to artificial inflation, which is apparent when you look at their profit margins and how they've increased since 2020 when everyone started freaking out about inflation...
fucking computer science is going from on par with mathematics to worse than biology
"why do you guys do it that way"
"Look because if we don't sacrifice the goat on Thursday the code breaks, idk what to tell you"
turns out the Thursday goat service brings in Dianne from networking, who remembers she needs to reboot a specific device weekly, but it's not documented anywhere. When Dianne doesn't do this, everyone freaks out and grabs another goat to sacrifice, which brings her back (because who is she to say no to some good goat), and the cycle is continued and reinforced
Aye, Iris.
Engineering: We only care if it works, even if it breaks math/physics/chemistry/biology.
If pi is not exactly three why hasn't my bridge fallen?
Check mate mathematicians.
Physics: oh, and if you look close enough, it's actually all probability too.
or far away enough...

Bloodywood well regard that guy in Gaddaar.
Three months after broadcasting that song, WWE bought the UFC.
Co-Cain is the most hilarious joke ever planted around.
Economics: the law is true as long as people believe it's true.
Kind of like fairies, when you think about it
Meanwhile the mathematicians who got a bit too close to Philosophy are still arguing about which logic to use and whether a proof by contradiction is even a proof at all.
I'm a chemist, and I just taught a class today. The main topic of the whole lesson was this: we have all these theories and methodologies; we are not going to study how they work and how to use them; instead, let's discuss all the limitations they have and the cases where they do not work.