[–] Hegar@fedia.io 15 points 1 month ago (2 children)

Increasing wealth has only ever been observed to fuel greater inequality.

I don't see any evidence that the value that increasing automation is bringing will be distributed more evenly.

We produce enough food for everyone and still let people starve - equal access to AI is even harder to justify than equal access to food.

[–] pennomi@lemmy.world 5 points 1 month ago (2 children)

I’m not so sure about that. When we compare medieval wealth inequality to now, it was worse back then. Ew, a link to Reddit, but it’s got good info.

Not saying we don’t need to fix things… we need to destroy even the concept of billionaires. While things are bad, and trending worse, they’re not yet “literally eat the rich” bad.

[–] Semjaza@lemmynsfw.com 6 points 1 month ago

Only when not looked at on a global scale (such as "1% owns 99%" being assumed to refer to the USA rather than to global wealth). Mormengil's opening response is very feels-over-facts (also, the claim that there was no state support for the poor may be technically true, but churches, local elites, and royal diktat were often involved in poor relief and charity in the Middle Ages); the later responses are better detailed.

And in the 1200s, global wealth inequality and access to food were, for much of the world, better than or comparable to where they are now.

But global wealth inequality and access to food got worse after colonisation rearranged American and then African economies for European, and then USAian, benefit.

AI is already filled with implicit bias towards the current status quo. It can be very tricky to get AI chatbots to give anything other than platitudes about inequality, and they're often very quick to shut down or redirect talk of system change. To think that they won't reflect continued post-colonial and extractivist systems of power seems, to me, shortsighted.

[–] Hegar@fedia.io 4 points 1 month ago

I'm not sure that link does have good info.

That's a zero-point comment on AskHistorians, from 11 years ago, with no sources listed, no details, and little explanation. The follow-up comments have a little more info, but only from 1870 onwards, and even then they're only talking about land, not wealth. Also, the only source linked is a NY Review of Books article that 404s.

I think it's fairly safe to assume that wealth inequality was lower before industrialization, which really supercharged the power of capital, encouraging and rewarding larger and larger accumulations of it. Before that, it's also much harder to get reliable data.

Aristotle, in the Politics, mentions a plan to cap wealth inequality at 1:5: once you have more than five times the wealth of the poorest citizen, your wealth is redistributed. He thinks it too radical, but could you imagine anyone talking about capping CEO pay at five times the janitor's? That's unthinkable to us.

[–] JackGreenEarth@lemm.ee -4 points 1 month ago (1 children)

You would just have to let a superintelligent (aligned) AI robot loose and prompt it to produce enough food for everyone. It wouldn't even take any maintenance effort once the robot had been created. If benefiting everyone else carries no negative consequences for the creators, and there are any empathetic people on the board of creators, I don't see why it wouldn't be programmed to benefit everyone.

[–] pennomi@lemmy.world 6 points 1 month ago (1 children)

As long as it doesn’t generate any negative externalities, sure. That’s a huge alignment problem though.

[–] JackGreenEarth@lemm.ee 0 points 1 month ago

True, and I have my doubts about the alignment problem being solved. But that's a technical problem, a separate conversation from whether even attempting it is worthwhile in the first place.