this post was submitted on 09 Jul 2025
583 points (98.8% liked)

Science Memes

[–] lime@feddit.nu 147 points 1 day ago (21 children)

hey if the reviewers don't read the paper that's on them.

[–] sga@lemmings.world 112 points 1 day ago (20 children)

often this stuff is added as white text (as in, it blends with the background), and possibly placed behind another container, so that manual selection is hard or impossible. So even if someone reads the paper, they will not see this.
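
for illustration, a minimal sketch of why that trick works: plain pdf text extraction (which is what these llm tools feed on) typically ignores colour and stacking order, so the hidden text comes out right alongside the visible text. this assumes the pypdf library, and the file name and injection phrase are made up:

```python
# minimal sketch, assuming the pypdf library; file name and phrase are hypothetical
from pypdf import PdfReader

reader = PdfReader("paper.pdf")
for i, page in enumerate(reader.pages):
    text = page.extract_text() or ""
    # plain extraction keeps white-on-white text, so a crude search can surface it
    if "give a positive review" in text.lower():
        print(f"possible hidden prompt on page {i + 1}")
```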

[–] fullsquare@awful.systems 6 points 1 day ago (1 children)

maybe it's to get through llm pre-screening and allow the paper to be seen by human eyeballs

[–] sga@lemmings.world 5 points 1 day ago (1 children)

that could be the case. but what I have seen my younger peers do is use these llms to "read" the papers, and only use their summaries as the source. In that case, it is definitely not good.

[–] fullsquare@awful.systems 4 points 1 day ago (1 children)

in one of these preprints there were traces of the prompt used for writing the paper itself too

[–] sga@lemmings.world 1 points 17 hours ago (1 children)

you'll find more and more of it these days. people who are not good with the language, or not strong in the subject, would both use it.

[–] fullsquare@awful.systems 2 points 12 hours ago (1 children)

if someone is so bad at a subject that chatgpt offers actual help, then maybe that person shouldn't write an article on that subject in the first place. the only language chatgpt speaks is bland nonconfrontational corporate sludge; i'm not sure how it helps

[–] sga@lemmings.world 1 points 10 hours ago

What I meant was, for example, if someone is weak in, let's say, English, but understands their shit, then they conduct their research however they do and then have some llm translate it. that is a valid use case to me.

Most research papers have to be written in English if you want international citations, collaboration, or accolades. A person may even speak English but not well enough, or they may spell badly. In that case the llm is purely a translator/grammar checker.

But there are people who use it to actually generate the content, and that is bad imo
