Advocates for clean energy, not nuclear
Against a technology with a lot of hype that also no one wants
🤔😵💫
Drewelite
Everyone will take whatever advantage they can from any result. Apply previous sentence to any scenario.
If Apple (or any metaphorical creator you want to insert in here) doesn't want you using their product to make your movie, too bad. You bought their product. Even if millions of people end up watching your movie, they can't turn around and ask for any more. You acquired their product fairly like anybody else. Your transaction is done. If they don't like it, they should ask every person who's ever made or contributed to any version of the components in their device and see how they feel about it.
Now people using ChatGPT to impersonate artists shouldn't do that. But those individual people should be prosecuted. Nobody's confused that Andy Warhol might be quickly painting the pictures and sending them over in the DALL-E chat and you can't honestly make the argument that people aren't buying Stephen King books because they can type "Write me a Stephen King novel" into the prompt generator.
By that rationalization, OpenAI is paying their Internet bill, and for a copy of Dune, so they're free to use any content they acquired to make their product better. Your original argument wasn't akin to, "Shouldn't someone using an iPhone pay for one?" It was "Shouldn't Apple get a cut of everything made with the iPhone?"
You could make the argument that people use ChatGPT to churn out garbage content, sure, but a lot of cinephiles would accuse your proverbial indie movie of being the same and blame Apple for creating the iPhone and enabling it. If you want to make that argument, go ahead. But don't pretend it has anything to do with people getting paid fairly for what they made.
ChatGPT is enabling people to make more things, more easily, and to get paid. And people, as always, are relying on everything that was created before them as a basis for their work. Same as when I go to school and the professor shows me lots of different works to learn from. The thousands of students in that class didn't pay for any of that stuff. The professor distilled it and presented it, and I paid him to do it.
I think what you're forgetting is that intelligence, in general, is an emergent property of recording information and learning what actions to take based on it. The current work on AI is essentially trying to take this evolutionary behavior, make it less random, and compress the cycles of iteration down so that intelligence emerges quickly. This whole argument of "It's not smart like I'm smart," with only surface-level observation about its current state and no critical observation about how intelligence came to be, just sounds really insecure.
I get it. Humans will likely not be the smartest thing in the arena soon. But stating matter-of-factly that AI is inherently different is born from an emotional viewpoint. I understand there ARE differences, but no more so than the differences between a human and a dog. And if you're looking at the situation honestly, AI is impressively close to human intelligence in such a short time.
Fully agree. I understand why there are many technological doomers out there, and I think AI may be the most deserving of a critical eye. But the immense benefits of being able to manufacture intelligence are undeniable. That NECESSITATES the AI being able to observe anything and everything in the world that it can. That's how any known intelligence has ever learned, and there's no scientific basis for an intelligence coming into existence knowing everything about the world without ever being taught about it.
Now I've heard a lot of criticism of AI. Some really legitimate concerns about its place in the future (and ours), as well as the ethics of this important technology originating in the private hands of mega corps that historically have not had our best interest at heart. But the VAST majority of criticism has been about how it's not useful or is just an avenue for copyright abuse. Which at best is just completely missing the point. But at worst, it's the thinly veiled protest of people made very uncomfortable that the status quo is being upset.
Perhaps this is the perspective people need.
I think your mentality is great. I've heard people say, "Sure I'll eat a burger, but what kind of psychopath wants to kill an animal themselves?"
I don't know, what kind of a psychopath pays an industry to do it for them so they don't have to feel bad about it? Look, I get it, I don't hunt. But I respect the people who respectfully end the animal's life themselves. Only they can really understand the cost. We throw away some old chicken we forgot to cook while passing judgment on the people we paid to get it for us and how they did it.
Do you not think AGI is possible?
People who hire writers don't write their own words. You can say that human connection is a crucial part of the writing process. But I just honestly don't think that's true for the vast majority of things we write. Besides, eventually AI will be indistinguishable from, if not better than, a human writer.
When we hit AGI, if we can continue to keep open source models, it will truly take the power of the rich and put it in the hands of the common person. The reason the rich are so powerful is that they can pay other people to do things. Most people only have the power to do what they can physically do in the world, but the rich can multiply that effort by however many people they can afford.
I'm sure wondering why they built that.