
[–] Wirlocke@lemmy.blahaj.zone 1 points 8 months ago (1 children)

Typically, for the AI to do anything useful, you'd copy and paste the medical records into it, which would be patient data.

Technically you could redact enough data to keep it in line with HIPAA, but if people are careless enough not to proofread their paper, I doubt they'd prep the data correctly either.
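
As a rough illustration of what "prepping the data correctly" might look like, here's a minimal Python sketch that strips direct identifiers from a record before any text is pasted into a cloud LLM. The field names and the `deidentify` helper are hypothetical, and real HIPAA Safe Harbor de-identification covers 18 identifier categories, so treat this as a sketch rather than a compliant implementation.

```python
# Hypothetical sketch: drop direct identifiers from a record before it
# ever reaches a cloud LLM. Field names are made up for illustration.

SAFE_HARBOR_FIELDS = {
    "name", "address", "phone", "email", "ssn", "mrn",
    "date_of_birth", "insurance_id", "ip_address",
}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifier fields removed."""
    return {k: v for k, v in record.items() if k.lower() not in SAFE_HARBOR_FIELDS}

record = {
    "name": "Jane Doe",
    "date_of_birth": "1985-02-17",
    "mrn": "00123456",
    "chief_complaint": "persistent cough, 3 weeks",
    "history": "non-smoker, no prior pulmonary disease",
}

prompt_safe = deidentify(record)
# Only the remaining, non-identifying text would be pasted into the LLM.
print(prompt_safe)
```

Even then, free-text fields like the history can still leak identifying details, which is exactly the kind of thing a careless user won't catch.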

[–] survivalmachine@beehaw.org 2 points 8 months ago (1 children)

ChatGPT isn't the one bound by HIPAA in that scenario. The medical provider who inputs your PHI into a cloud-based LLM is the one violating your HIPAA rights.

[–] Wirlocke@lemmy.blahaj.zone 2 points 8 months ago

Just to clarify, I meant that the medical provider would be the one getting sued; I didn't think ChatGPT would be in the wrong.

ChatGPT has just done a great job of revealing how lazy and sloppy people are everywhere.