this post was submitted on 01 Feb 2024
224 points (93.1% liked)

[–] homesweethomeMrL@lemmy.world 56 points 9 months ago (2 children)

. . . NVIDIA in-house counsel Nikki Pope: In a panel on “governing” AI risk, she cited internal research that showed consumers trusted brands less when they used AI.

This gels with research published last December that found only around 25 percent of customers trust decisions made by AI over those made by people. One might think an executive with access to this data might not want to admit to using a product that would make people trust them less.

Indeed.

[–] Bigmouse@lemmy.world 14 points 9 months ago

25% is an abnormally large number considering that current technology can't do the same things a human can. In my experience, current "AI" is mostly useful for very specific tasks with very narrow guidelines.

[–] kromem@lemmy.world 3 points 9 months ago

What's interesting is the research showing that when humans don't know the output was generated by an AI, they prefer and trust it more than output from actual humans.