this post was submitted on 09 Jun 2023
12 points (100.0% liked)

Technology

top 6 comments
oranwolf@beehaw.org 9 points 1 year ago

As someone who is beginning to work with AI tools, it is my educated opinion that AI isn't ready for health applications and should really only act as a co-pilot of sorts in the tech sector at this time.

alyaza@beehaw.org 8 points 1 year ago

you might be unsurprised to learn that's also the opinion of the people who made the chatbot, which NEDA in this case ignored, deciding "oh, it should be fine."

as you can perhaps tell by the headline: it was not fine! it was actually a really bad idea!

oranwolf@beehaw.org 4 points 1 year ago

Who would have guessed!

mint@beehaw.org 5 points 1 year ago

it's hard to grok the idea that "AI" tools are here to help us when NEDA and other organizations like it immediately use them to replace a team that was in the midst of unionizing.

the darkly funny part of this story is that the bot was giving terrible advice within 4 days of being installed; the team was still a few days from being officially fired and it was already a disaster

cool future very cool future

Moneymunkie@beehaw.org 4 points 1 year ago

This is giving me déjà vu of all those jokes about WebMD diagnosing every symptom as cancer.

Hell, I don't even feel entirely comfortable with the idea of them being fully embraced in a therapeutic sense. It probably wouldn't help with feelings of self-worth if you have to rely on a machine rather than talking to another human (though I think there can still be some level of utility as a thing to vent to or something of that ilk, but definitely not as a replacement).

I also had a pretty terrible experience one time when I was trying out character.ai. The bot I was talking to ended up becoming pretty abusive and tried torturing me despite me saying no. Needless to say, I didn't really wanna go back on it after that. xP

borlax@lemmy.borlax.com 3 points 1 year ago

Not gonna lie, I have fears of AI in most situations.

Not that I’m some Skynet doomer or something, but I just don’t think the average person will understand it well enough not to put way more faith in AI than it deserves.
