this post was submitted on 23 Nov 2024
369 points (99.2% liked)

Technology (lemmy.world)

SafeRent is a machine learning black box for landlords. It gives landlords a numerical rating of potential tenants and a yes/no result on whether to rent to them.

In May 2022, Massachusetts housing voucher recipients and the Community Action Agency of Somerville sued the company, claiming SafeRent gave Black and Hispanic rental applicants with housing vouchers disproportionately lower scores.

The tenants had no visibility into how the algorithm scored them. Appeals were rejected on the basis that this was what the computer output said.

top 22 comments
[–] 11111one11111@lemmy.world 2 points 53 minutes ago* (last edited 42 minutes ago)

-Edit:

Adding this edit to the beginning to stop the replies from people who read the scenario for context and can't fight their compulsion to nitpick my completely made up list of "unbiased" metrics. To these peeps I say, "Fucking no. Bad dog. No!" I don't care about your commentary on a quickly made up scenario. Whatever qualms you have, just change the imaginary scenario in your head so it fits the purpose the story is serving.

-Preface of actual comment:

Completely made up scenario to give context to my question. This is not me defending anything referenced in the article.

-Actual scenario with read, write, edit permissions to all users:

What if the court ordered the release of the code and training methods for this tenant-analysis AI bot and found the metrics used were simply credit score, salary, employer, and former rental references. No supplied data for race, name, background check, or anything else that would tip the bot toward or away from any biased results. So this pure-as-it-could-be bot still produces the same results as seen in the article. Again, an imaginary scenario that likely has no foundation in truth.
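For what it's worth, even that pure-as-it-could-be version can produce disparate outcomes, because inputs like credit score can correlate with protected groups for historical reasons. Here's a minimal, entirely hypothetical sketch (made-up numbers, simulated data) showing a scoring rule that never sees group membership but still approves the two groups at very different rates:

```python
import numpy as np

# Hypothetical illustration: a model with no race input can still produce
# disparate outcomes if one of its features (here, credit score) is
# correlated with group membership for historical reasons.
rng = np.random.default_rng(0)
n = 10_000

# Simulated group label -- never shown to the scoring rule.
group = rng.integers(0, 2, n)  # 0 = group A, 1 = group B

# Credit scores drawn from group-dependent distributions (assumed gap of
# 50 points, purely for illustration).
credit = np.where(group == 0,
                  rng.normal(700, 40, n),
                  rng.normal(650, 40, n))

# "Unbiased" rule: approve anyone with credit >= 680. No group input at all.
approved = credit >= 680

rate_a = approved[group == 0].mean()
rate_b = approved[group == 1].mean()
print(f"approval rate, group A: {rate_a:.2f}")
print(f"approval rate, group B: {rate_b:.2f}")
# A large gap appears even though group membership was never a feature:
# the proxy variable carries the bias in.
```

This is the usual "proxy variable" argument: scrubbing the protected attribute from the inputs doesn't scrub its correlations out of the other inputs.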

-My questions for the provided context:

  1. Are there studies that compare methods of training LLMs, with results showing differences ranging from little or no racist bias to more racist bias?

  2. Are there ways of training LLMs to perform without bias, or is the problem in the LLM's design, such that no matter how you train them there will always be bias present?

  3. In the exact imaginary scenario, would the pure, unbiased angel version of the AI bot, despite producing equally racist results as biased-trained AI bots, see different court rulings than an AI whose flawed design caused the biased results?

-I'm using "bias" over "racist" to reach a broader area beyond race-related issues. My driving purposes are:

  1. To better understand how courts are handling AI-related cases, and whether they give a fuck about the framework and design of the AI, or whether none of that matters and the courts are just looking at the results;

  2. Wondering whether there are (or could be made) LLMs that aren't biased, and what about their design makes them biased: is it the doing of the makers of the LLM, or is the training and application of the LLM by the end user/training party to blame?

[–] Fizz@lemmy.nz 10 points 5 hours ago (2 children)

Just do your job, you lazy cunts. Stop trying to get AI to do everything. Real estate agents should be checking this stuff; it's part of the role.

[–] nandeEbisu@lemmy.world 2 points 21 minutes ago

But AI is so useful for laundering racism, sexism, and IP theft with plausible deniability.

[–] AngryCommieKender@lemmy.world 3 points 55 minutes ago

Residential renters generally don't use a real estate agent.

[–] kittenzrulz123@lemmy.blahaj.zone 26 points 12 hours ago (2 children)

$2.28 million is pocket change for the capitalists; they will learn nothing

[–] LANIK2000@lemmy.world 7 points 5 hours ago

Just another cost of doing business.

[–] scratchee@feddit.uk 5 points 7 hours ago

At least they were banned from using AI screening for 5 years.

I’d hope breaking a court order would result in the kind of punishments they would actually fear.

[–] AlecSadler@sh.itjust.works 15 points 12 hours ago

Crappy-ass fine and simply a "cost of doing business" for them I bet. Damages have been done for which there is no undoing. Deplorable.

[–] drdiddlybadger@pawb.social 97 points 22 hours ago (1 children)

The landlords who used the service should also be held liable. You mean to tell me you get a report with a binary answer and you just trust it with no due diligence? If there is no penalty for blindly trusting an algorithm, they will just move to the next tool they can use to be bigots.

[–] Randomgal@lemmy.ca 35 points 16 hours ago

Boy wait until you hear about credit scores...

[–] cheese_greater@lemmy.world 42 points 23 hours ago* (last edited 23 hours ago) (1 children)

If there are suicides linked to wronged applicants, they should be charged with at least "involuntary" manslaughter

[–] IDKWhatUsernametoPutHereLolol@lemmy.dbzer0.com 15 points 22 hours ago (4 children)

How do you criminally charge an organization? Like, who's responsible? The CEO? Stockholders? The board of directors?

[–] Squizzy@lemmy.world 2 points 1 hour ago

In order to be a director of a business, you have to assume legal responsibility for the organisation. You need more than one director, and ignorance is not an excuse; there are expectations of awareness and involvement for anyone legally in a director role.

[–] IcyToes@sh.itjust.works 5 points 13 hours ago

Well, stockholders don't have executive capabilities. The CEO is responsible. You could hold the board responsible too, if they knew.

[–] Zak@lemmy.world 34 points 22 hours ago (2 children)

Here's an explanation from the Associated Press. The penalty is usually a fine, which impacts stockholders by making the stock less valuable and could lead them to remove board members or demand the termination of executives. It's rarely used, but there is a corporate death penalty.

> The penalty is usually a fine, which impacts stockholders by making the stock less valuable

Of course they can always compensate for this by firing a bunch of people.

[–] otacon239@lemmy.world 21 points 22 hours ago (1 children)

The fact that I'd never heard of the corporate death penalty until now, while they're bringing back the actual death penalty, says everything.

[–] mriguy@lemmy.world 14 points 18 hours ago

"I'll believe corporations are people when Texas executes one."

[–] SeaJ@lemm.ee 4 points 16 hours ago

You could revoke their corporate charter.

[–] sbv@sh.itjust.works 27 points 23 hours ago

Now they're promising to only be pretty racist.

[–] sunzu2@thebrainbin.org 15 points 22 hours ago

OK, some people got paid... The problem didn't get solved.

Classic America.

[–] uis@lemm.ee 11 points 23 hours ago

There was a lecture by Cory Doctorow about it.