this post was submitted on 02 Mar 2024
204 points (89.2% liked)

Technology

[–] wonderfulvoltaire@lemmy.world 46 points 8 months ago (1 children)
[–] 1984@lemmy.today 13 points 8 months ago (3 children)

Someone said on Hacker News that he is doing it to force OpenAI to share their secrets.

[–] bionicjoey@lemmy.ca 39 points 8 months ago (2 children)

There's no real secret. They scraped the collected works of humanity with no regard for intellectual property rights and are now hiding behind the guise of being a "non-profit" company, despite raking in loads.

[–] JorMaFur@lemm.ee 3 points 8 months ago

I get the last part of your comment, where they make money from their AIs, but could you explain why them scraping everything on the internet is a bad thing? I'm genuinely curious, as I see their ability to scrape just about everything as more of an advantage than a disadvantage: if we start getting LLMs that only scrape certain sites or certain topics, we're just making the echo chambers even more defined.

Them scraping everything feels a bit like the internet era where APIs were open and semi-unlimited.

I do get that them having used, for example, all the Tolkien books as part of their dataset, and users being able to ask their tools to "write a new LOTR book", could be seen as a problem. But even then it's only a problem when they commercialise it, and we have laws in place for those things. Fanfics have always been a thing and are much the same, except that they take more effort because a human is writing them.

Again: I'm really curious about your view and opinion, as you can see that mine is quite different.

[–] 1984@lemmy.today 0 points 8 months ago

There is a secret somewhere, because they are doing a lot better than Microsoft and Google. :)

[–] tabular@lemmy.world 8 points 8 months ago* (last edited 8 months ago) (1 children)

Might be easier to force them to change their name if they ain't "open" (source) anymore /s

[–] hemko@lemmy.dbzer0.com 3 points 8 months ago

Not gonna lie, their name is a huge annoyance for me. They're about as open as Bill Gates is a philanthropist

[–] hemko@lemmy.dbzer0.com 45 points 8 months ago (2 children)

Yeah well can't really argue with that

it fundamentally accuses OpenAI and its CEO, Sam Altman, of pretending to run a nonprofit designed to benefit humanity while actually running a regular ol’ tech company and trying to make a lot of money.

[–] SlopppyEngineer@lemmy.world 44 points 8 months ago (1 children)

Coming from the same guy that promotes free speech by banning those that disagree with him.

[–] hemko@lemmy.dbzer0.com 23 points 8 months ago (1 children)

Yeah, there's some serious hypocrisy there, but honestly I don't mind him poking at OpenAI in the slightest; actually, I'm happy about it. If he happens to burn some good money in that lawsuit without establishing anything, that doesn't hurt anyone either.

[–] mindlight@lemm.ee 2 points 8 months ago

Funny thing: he's not only spending his own money, he's burning the taxpayers' money too.

All these useless lawsuits take up important court resources, but lucky for us that Elon has a taxable income....

[–] Imgonnatrythis@sh.itjust.works 4 points 8 months ago

As a statement you can't, but as an explanation of breach of contract, a first year law student certainly could.

[–] Hestia@lemmy.world 43 points 8 months ago* (last edited 8 months ago) (3 children)

Read a bit of the court filing, though not the whole thing, since you get the gist pretty early on. Journos put spin on everything, so here's my understanding of the argument:

  1. Musk, who has given money to OpenAI in the past, and thus can legally file a complaint, states that
  2. OpenAI, which is registered as an LLC, is legally a nonprofit, and has the stated goal of benefitting all of humanity, has
  3. Been operating outside of its legally allowed purpose, and in effect
  4. Used its donors, resources, tax status, and expertise to create closed source algorithms and models that currently exclusively benefit for-profit concerns (Musk's attorney points out that Microsoft Bing's AI is just ChatGPT) and thus
  5. OpenAI has committed a civil tort (a legally recognized civil wrong) wherein
  6. Money given by contributors would not have been given had the contributors been made aware of this deviation from OpenAI's mission statement, and
  7. The public at large has not benefited from any of OpenAI's research, and thus OpenAI has abused its preferential tax status and harmed the public

It's honestly not the worst argument.

[–] Shelena@feddit.nl 16 points 8 months ago* (last edited 8 months ago)

I actually agree with this. This technology should be open. I know that there are arguments to keep it closed, like it could be misused, etc. However, I think that all the scary stories about AI are also a way to keep attention away from the fact that if you have a monopoly on it, you have enormous power. This power will grow as the tech is used more and more. If all this power is in the hands of a commercial business (even though they say they aren't one), then you know AI is going to be misused to make money. We do not have clear insight into what they are doing, and we have no reason to trust them.

You also know that bad actors, like dictatorial governments will eventually get or develop the technology themselves. So, keeping it closed is not a good way to protect it from that happening. At the same time, you are also keeping it from researchers who could investigate how to use and develop it further to be used responsibly and to the benefit of humanity.

Also, they relied on data generated by people in society who never got any payment or anything for it. So it is immoral not to share the results openly with those same people, and to instead keep them closed. I know they used some of my papers, yet I am not allowed to study their model. Seems unfair.

The dangers of AI should be kept at bay using regulation and enforcement by democratically chosen governments, not by commercial businesses or other non-democratic organisations.

[–] 1984@lemmy.today 15 points 8 months ago

I don't want Musk to be right, but I have to admit, it sounds legit.

[–] conciselyverbose@sh.itjust.works 9 points 8 months ago

Yeah, fuck "it's not in the terms of a contract". It's fraud.

You can't advertise yourself as a nonprofit organization for the public good, collect donations under that pretense, then just privatize anything you learn for profit.

People don't donate to for-profit companies.

[–] kokesh@lemmy.world 8 points 8 months ago (1 children)

Who would expect that... His madness is progressing visibly in front of everyone's eyes

[–] Imgonnatrythis@sh.itjust.works 3 points 8 months ago

This isn't madness it's just attention seeking and stupidity. You know when you get a dumb idea and someone tells you no that's a dumb idea or you just eventually realize it is? Imagine if you have a dumb idea and everyone around you just tells you it's a great idea and also you have an inflated ego and nothing you do has any meaningful consequences at all. Also everyone around you stands to potentially profit from your dumbness so they really egg you on. You start to look the fool quite quickly.

[–] NOT_RICK@lemmy.world 4 points 8 months ago

Space Karen the SLAPP Lord