This soap opera just got another twist! It's like clockwork, considering how Microsoft has previously invested in OpenAI and Bing uses their technology.
I'd say this is an amazing result for MS. Not only is their investment mostly Azure credits, which keeps OpenAI dependent on MS, but now they've also got Altman and his followers to themselves for further research.
Good for MS, bad for humanity. I believe one of the worst possible timelines for the average human is one where for-profit capitalist entities control access to AGI and hoard all the benefits accrued from it. OpenAI was founded specifically to avoid this, with a complicated governance structure designed to ensure true AGI would end up owned by humanity, with the benefits shared by all.
The OpenAI board and top researchers made a desperate bid to prioritize safety over profits, and even with that elaborate governance structure behind them capitalism still seems to have found a way to fuck us.
Today we saw Satya Nadella and Sam Altman steer humanity further from a possible utopia and closer to... Cyberpunk 2077.
Good luck everyone!
Don't be too worried about AGI being a thing in the short term. The only thing I find troubling about consolidation is that contemporary AI requires a lot of hardware thrown at it, while cloud services (providing this hardware on demand) are practically a triopoly. That sucks if you want to be the next AI startup. But academia is mostly unaffected, and far from lagging behind (multiple open-source LLMs are compelling alternatives to ChatGPT, and not benefitting from OpenAI's millions in marketing and hype doesn't make them less valuable).
Fair enough, but the short-term track we're on still leads to a hellish dystopia. For the societal damage I'm worried about to happen, we don't really need AGI as you are probably defining it. If we use the OpenAI definition of AGI, "systems that surpass human capabilities in a majority of economically valuable tasks", I'd argue that the technology we have today is practically there already. The only thing holding back the dystopia is that corporate America hasn't fully adapted to the new paradigm.
Imagine a future where most fast food jobs have been replaced by AI-powered kiosks and drive-thrus.
Imagine a future where most customer service jobs have been replaced by AI-powered video chat kiosks.
Imagine a future where most artistic commission work is completed by algorithms.
Imagine a future where all the news and advertising you read or watch is generated specifically to appeal to you by algorithms.
In this future, are the benefits of this technology shared equitably, so that the people who used to do these jobs can enjoy their newfound leisure time? Or will those folks live in poverty while the majority of their incomes are siphoned off to the small fraction of the populace who are MS investors?
I think we all know the answer to that one.
To help you out with the monopolistic/capitalist concern: https://simonwillison.net/2023/May/4/no-moat/
tl;dr: OpenAI's edge with ChatGPT is essentially minor (according to people on the inside), and the approach of building ever-larger, inflexible models is being challenged by smaller, more agile models that are technologically more accessible and available.
Funny you bring this one up :)
https://marshallbrain.com/manna1
To a large extent, we have been there for a long time:
https://www.youtube.com/watch?v=7Pq-S557XQU
This, and the theory of bullshit jobs:
https://strikemag.org/bullshit-jobs/
were formative reads for me.
The end-game is pretty clear: we have reached the limits of the model on which our current society is built (working jobs to earn money, to spend money, to live). We now have an excess supply of the essential goods needed to sustain lives and a scarcity of jobs at the same time. We will soon have to either accept that working isn't a means to an end (accept universal basic income and state interventionism), or enter a neofeudalism era where resources are so consolidated that the illusion of scarcity can be maintained to justify the current system (which is essentially what bullshit jobs are all about).
It's perhaps the most important societal reform our species will know, and nobody's preparing for it :)
This is already the case today:
https://en.wikipedia.org/wiki/Filter_bubble
And this is already weaponized (e.g. TikTok's algorithm trying to steer the youth towards education and science in China and towards … something completely different in the rest of the world).
@u_tamtam
It doesn't really matter if Microsoft/OpenAI are the only ones with the underlying technology as long as the only economically feasible way to deploy the tech at scale is to rely on one of the big 3 cloud providers (Amazon, Google, Microsoft). The profits still accrue to them, whether we use a larger/inflexible or smaller/flexible model to power the AI - the most effective/common/economical way for businesses to leverage it will be as an AWS service or something similar.
Are you saying you're cool with neofeudalism? Or just agreeing that this is yet another inevitable (albeit lamentable) step towards it?
Yup, but as the "no moat" link I posted implied, at least for LLMs it might not be necessary to spend very much on hardware to be almost as good as ChatGPT, so that's some good news.
Oh, crap, no, sorry if I wasn't clear. I believe we are at a crossroads, with not much middle ground between our society evolving into extensive interventionism, taxation, and wealth redistribution (to support UBI and other schemes for the increasingly large unemployable population) or neofeudalism. I don't want billionaires and warlords to run the place, obviously. And I'm wary about how the redistribution would go with our current politicians and the dominant mindset associating individual merit with wealth and individualistic entrepreneurship.