this post was submitted on 24 Jul 2024
92 points (100.0% liked)

science

top 11 comments
[–] leisesprecher@feddit.org 34 points 1 month ago (2 children)

I think we need a fundamentally different approach to scientific publishing. It's completely absurd that there are probably plenty of unintended replication studies out there simply because research groups have no way of knowing that their study has already been done.

And the expectation to actually read that avalanche of articles in even a niche subject is absolutely bonkers. 95% of articles are effectively write-only and will never have any impact whatsoever.

[–] PaintedSnail@lemmy.world 6 points 1 month ago (1 children)

I know little of the ins and outs of scientific publishing, but that didn't stop me from having a dumb thought: could the fediverse be a potential solution? Each university or research group could host its own instance of some software built specifically for publishing papers. Papers could cross-link citations to papers on other instances, people could comment across instances under identities tied to their home instance, paper revisions could be tracked easily, and bad citations could be spotted when a paper is updated or retracted, that kind of thing. The currency then becomes the reputation of the organizations and individuals, and this opens up a ton of data for automated analysis. I just don't know enough to know what problems would arise.
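
To make that dumb thought slightly more concrete, here's a rough sketch of what a federated paper record might look like, in Python. Every field name here is invented for illustration; this isn't any real ActivityPub vocabulary:

    from dataclasses import dataclass, field

    @dataclass
    class PaperRecord:
        # Globally unique: home instance plus a local identifier,
        # much like user@instance addressing on the fediverse.
        instance: str       # e.g. "papers.example-university.edu" (made up)
        local_id: str       # identifier assigned by the home instance
        title: str
        authors: list[str]
        revision: int = 1   # bumped on every update
        retracted: bool = False
        # Citations are links to records on other instances, pinned to the
        # revision that was actually cited, e.g.
        # "papers.other-university.edu/paper/abc123/rev/2" (also made up).
        citations: list[str] = field(default_factory=list)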

[–] leisesprecher@feddit.org 2 points 1 month ago (1 children)

but that didn’t stop me from having a dumb thought:

Nothing should.

Anyway, something like this essentially already exists; it's just that the "instances" are journals, which sift through submissions and only publish peer-reviewed articles.

The real problems that would need to be addressed are:

  • How can I make sure that the citations are real and actually useful? Citation cartels are already a thing.
  • How can the review process be ported to that approach without losing the independence of the reviews? Reviewers are supposed to be anonymous and not affiliated with the authors in any way.
  • How can the amount of articles be reduced? Currently, you're forced to publish as much as possible; published articles in "good" journals are your currency as a researcher.

The idea is far from bad or impractical, but it can't really address the cultural issues in science. The entire scientific publication ecosystem and culture is essentially stuck in the early 20th century, but with 1000 times as many people involved and much, much higher stakes.

[–] PaintedSnail@lemmy.world 1 points 1 month ago

How can I make sure that the citations are real and actually useful? Citation cartels are already a thing.

I'm thinking that citations in papers can be actual links (akin to hyperlinks) to the location in the cited paper itself. This way it can be automatically verified that there are no citation loops, that citations reference current revisions, and that the papers cited have not been retracted or otherwise discredited, and following citation trails becomes much easier. Would that help the citation-cartel issue, you think?
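
For what it's worth, once citations are machine-readable links, the loop and retraction checks become simple graph walks. A minimal sketch in Python, with a toy in-memory dict standing in for the cross-instance lookups:

    # Toy citation graph: paper id -> ids of the papers it cites.
    # In the real system each lookup would be a cross-instance fetch.
    citations = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],  # closes a loop: A -> C -> A
    }
    retracted = {"C"}

    def find_loop(start, graph):
        """Depth-first search from `start`; returns a citation loop if one exists."""
        def dfs(node, path, visiting):
            if node in visiting:
                return path + [node]  # walked back into our own trail: loop
            visiting.add(node)
            for cited in graph.get(node, []):
                loop = dfs(cited, path + [node], visiting)
                if loop:
                    return loop
            visiting.discard(node)
            return None
        return dfs(start, [], set())

    def bad_citations(paper, graph, retracted_ids):
        """Direct citations pointing at retracted papers."""
        return [c for c in graph.get(paper, []) if c in retracted_ids]

    print(find_loop("A", citations))                 # ['A', 'B', 'C', 'A']
    print(bad_citations("A", citations, retracted))  # ['C']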

How can the review process be ported to that approach without losing the independence of the reviews? Reviewers are supposed to be anonymous and not affiliated with the authors in any way.

How important is anonymity in reviews? My thought process is going the opposite way: by linking reviews and comments on papers to the person or institution making them, it encourages reviewers to be more responsible with their words and may reveal potential biases stemming from institutional affiliations.

How can the amount of articles be reduced? Currently, you're forced to publish as much as possible; published articles in "good" journals are your currency as a researcher.

Here I'm also thinking the exact opposite: the issue isn't the number of papers, it's how the papers are organized. We actually want MORE papers, for the reasons hinted at here: important papers are going unpublished because they are (for lack of a better word) uninteresting. A null result is not an invalid result, and it's important to get that data out there. By having journals gatekeep the data that gets released, we are doing the scientific community a disservice.

Of course, more papers means more junk papers published, but that's where having the papers openly available and having citations linked electronically comes in. The data can be fed into large-scale data-mining algorithms for meta-analysis, indexing, searching, and categorization. Plus, if it later turns out that a paper is junk, any papers that cite it (and any papers that cite those, and so on) can all be flagged for review or just automatically retracted.
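
The "flag everything downstream" part is just a breadth-first walk over reverse citation edges. Another toy sketch, same caveats as before (the dict stands in for real federated lookups):

    from collections import deque

    # Reverse citation edges: paper id -> ids of the papers that cite it.
    cited_by = {
        "junk": ["D", "E"],
        "D": ["F"],
    }

    def flag_downstream(retracted_id, cited_by):
        """Collect every paper citing the retracted one, directly or transitively."""
        flagged, queue = set(), deque([retracted_id])
        while queue:
            node = queue.popleft()
            for citer in cited_by.get(node, []):
                if citer not in flagged:
                    flagged.add(citer)
                    queue.append(citer)
        return flagged

    print(flag_downstream("junk", cited_by))  # {'D', 'E', 'F'}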

Thoughts?

[–] Jeredin@lemm.ee 4 points 1 month ago (1 children)

Agreed, and I wonder if this isn't a job that an AI might be able to help with: reading all the papers and, at the very least, pulling out key research subjects to compile for readers?

[–] loonsun@sh.itjust.works 1 points 1 month ago (1 children)

Semantic Scholar and Perplexity AI do a good job of using ML to help with that, but the problem with scientific publishing is fundamental to its business model, which needs to be uprooted to make modern science feasible.
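
Semantic Scholar also has a free public API that works well for this kind of survey work. Roughly like this in Python; the endpoint and field names are from memory of their docs, so double-check them before relying on this:

    import requests

    # Search the Semantic Scholar Academic Graph for papers on a topic.
    resp = requests.get(
        "https://api.semanticscholar.org/graph/v1/paper/search",
        params={"query": "replication crisis", "fields": "title,year", "limit": 5},
        timeout=30,
    )
    resp.raise_for_status()
    for paper in resp.json().get("data", []):
        print(paper.get("year"), paper.get("title"))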

[–] ThoGot@lemm.ee 1 points 1 month ago (1 children)

There's also scite.ai Assistant

[–] loonsun@sh.itjust.works 1 points 1 month ago

Didn't know about that one, thanks!

[–] ericjmorey@lemmy.world 17 points 1 month ago* (last edited 1 month ago)
[–] AbouBenAdhem@lemmy.world 8 points 1 month ago

So is Nature itself taking any steps to address the issue?

[–] QuadratureSurfer@lemmy.world 1 points 1 month ago

Relevant xkcd:

If only they had published all of their null results...