this post was submitted on 19 Nov 2023
498 points (87.1% liked)


Barack Obama: “For elevator music, AI is going to work fine. Music like Bob Dylan or Stevie Wonder, that's different”::Barack Obama has weighed in on AI’s impact on music creation in a new interview, saying, “For elevator music, AI is going to work fine”.

[–] BedSharkPal@lemmy.ca 42 points 11 months ago (4 children)

There is no way this ages well.

[–] otter@lemmy.ca 32 points 11 months ago* (last edited 11 months ago) (2 children)

I think the statement was more about the impact, which will depend on each person's subjective experience

Personally I agree. Even if AI could produce identical work, the impact would be lessened. Art is more meaningful when you know it took time and was an expression/interpretation by another human (rather than a pattern prediction algorithm Frankenstein-ing existing work together). Combine that with the volume of AI content that's produced, and the impact of any particular song/art piece is even more limited.

[–] 5BC2E7@lemmy.world 9 points 11 months ago* (last edited 11 months ago)

I'd say art is more meaningful when it's a unique experience. It's like those myths about glassmakers being ~~killed~~ blinded after the cathedral was finished so that no one could replicate the glass color... without the killing.

[–] Even_Adder@lemmy.dbzer0.com 4 points 11 months ago

People are social; if enough people feel the same way about something, it'll succeed. It doesn't matter where it came from or how it was made, like how people can still admire and appreciate nature. Or maybe the impact will be that it reduces all impacts: every group and subgroup might be able to have their own thing.

[–] gregorum@lemm.ee 14 points 11 months ago* (last edited 11 months ago) (1 children)

I don’t know. I think Obama kind of nailed it. AI can create boring and mediocre elaborations just fine. But for the truly special and original? It could never.

For the new and special, humans will always be required. End of line.

[–] kromem@lemmy.world 7 points 11 months ago (1 children)

At this point I want a calendar of the dates people say "AI could never" do something - like "AI could never explain why a joke it's never seen before is funny" (said around March 2019) - and the dates it actually happens (in that case, April 2022).

(That "explaining the joke" bit is actually what prompted Hinton to quit and switch to worrying about AGI sooner than expected.)

I'd be wary of betting against neural networks, especially if you only have a casual understanding of them.

[–] rambaroo@lemmy.world 2 points 11 months ago (1 children)

I mean, the limitations of LLMs are very well documented; they aren't going to advance a whole lot more without huge leaps in computing technology. There are limits on how much context they can store, for example, so you aren't going to have AIs writing long epic stories without human intervention. And they're fundamentally incapable of originality.

General AI is another thing altogether that we're still very far away from.

[–] kromem@lemmy.world -1 points 11 months ago (1 children)

Nearly everything you wrote is incorrect.

As an example, rolling context windows paired with RAG would easily allow building an LLM-based setup capable of writing long stories.
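Roughly like this (a toy Python sketch - `llm()` is a stand-in for whatever model or API you'd actually call, and the hash-based `embed()` is just a placeholder for a real embedding model):

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder embedding: hash words into a small vector.
    # A real system would call an embedding model here.
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def llm(prompt: str) -> str:
    # Placeholder for an actual LLM call.
    raise NotImplementedError("plug in a real model here")

def write_long_story(outline: list[str], window: int = 3) -> str:
    chapters: list[str] = []
    store: list[tuple[np.ndarray, str]] = []  # (embedding, chapter summary)

    for beat in outline:
        # Rolling window: only the last few chapters ride along verbatim.
        recent = "\n\n".join(chapters[-window:])

        # RAG: pull the most relevant older summaries back in by similarity.
        q = embed(beat)
        retrieved = sorted(store, key=lambda entry: -float(q @ entry[0]))[:2]
        memory = "\n".join(summary for _, summary in retrieved)

        prompt = (
            f"Relevant earlier events:\n{memory}\n\n"
            f"Most recent chapters:\n{recent}\n\n"
            f"Write the next chapter covering: {beat}"
        )
        chapter = llm(prompt)
        chapters.append(chapter)
        store.append((embed(chapter), llm(f"Summarize this chapter briefly:\n{chapter}")))

    return "\n\n".join(chapters)
```

The model never needs the whole book in context at once; it just needs the last few chapters plus whatever older material is relevant to the part it's writing right now.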

And I'm not sure where you got the idea that they were fundamentally incapable of originality. This part in particular tells me you really don't know how the tech is working.

[–] rambaroo@lemmy.world 4 points 11 months ago* (last edited 11 months ago) (1 children)

A rolling context window isn't a real solution and will not produce works that even come close to matching the quality of human writers. That's like having a writer who can only remember the last 100 pages they wrote.

The tech is trained on human created data. Are you suggesting LLMs are capable of creativity and imagination? Lmao - and you try to act like I'm the one who's full of shit.

[–] kromem@lemmy.world -1 points 11 months ago

That's like having a writer who can only remember the last 100 pages they wrote.

That's why you pair it with RAG.

The tech is trained on human created data. Are you suggesting LLMs are capable of creativity and imagination?

They are trained by iterating through network configurations until there are diminishing returns on how accurately they can complete that human created data.
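Concretely, that loop looks something like this (a toy Python sketch - the `model` object, the batches, and the numbers are all made up, just to show the shape of it):

```python
# Keep adjusting the network while it keeps getting better at completing
# held-out human-written text; stop once the improvement dries up.

def validation_loss(model, val_batches) -> float:
    # In a real setup: average next-token cross-entropy on text the model
    # never trained on. `model.loss` is a hypothetical method.
    return sum(model.loss(batch) for batch in val_batches) / len(val_batches)

def train(model, train_batches, val_batches, patience: int = 3):
    best, stale = float("inf"), 0
    while stale < patience:
        for batch in train_batches:
            model.update(batch)          # nudge weights to predict the data better
        current = validation_loss(model, val_batches)
        if current < best - 1e-3:
            best, stale = current, 0     # still improving meaningfully
        else:
            stale += 1                   # diminishing returns
    return model
```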

But they don't just memorize the data. They develop the capabilities to extend it.

So yes, they absolutely are capable of generating original content that's not in the training set, as has been demonstrated over and over: explaining jokes not found in the training data, solving riddles not found in it, and combining different concepts into a new synthesis not found in the original data.

What do you think it's doing? Copy/pasting or something?

[–] Knusper@feddit.de 11 points 11 months ago (1 children)

I think it will eventually become obsolete, because we keep changing what 'AI' means. But current AI largely just regurgitates patterns; it doesn't yet have a way of 'listening' to a song and actually judging whether it's good or bad.

So, it may expertly regurgitate the pattern that makes up a good song, but humans spend a lot of time listening to perfect every little aspect before something becomes an excellent song, and I feel like that will be lost on the pattern-regurgitating machine if it's forced to deviate from what a human composed.

[–] TopRamenBinLaden@sh.itjust.works 1 points 11 months ago* (last edited 11 months ago) (1 children)

I have seen a couple of successful artists in different genres admit to using AI to help them write some of their most popular songs, and describe its use in the songwriting process. You hit the nail on the head with AI not being able to tell if something is good or bad. It takes a human ear for that.

AI is good at coming up with random melodies, chord progressions, and motifs, but it is not nearly as good at composing and producing as humans are, yet. AI is just going to be another instrument for musicians to use, in its current form.

[–] Knusper@feddit.de 1 points 11 months ago

Yeah, I do imagine it won't be just AIs either. And then it will obviously be possible to take it to an excellent song, given enough human hours invested.

I do wonder how useful it will actually be for that, though. Oftentimes trying to go from good to excellent really fucks you up, and it can be freeing to start fresh instead. In particular, 'excellent' does require creative ideas, which are easier for humans to generate with a fresh start.
But AI may allow us to start over fresh more readily, if it can just give us a full song when needed. Maybe it will even be possible to give it some of those creative snippets and ask it to flesh it all out. We'll have to see...

[–] takeda@lemmy.world 4 points 11 months ago (1 children)

I do software engineering, and my company jumped on the AI bandwagon and got us GitHub Copilot. After using it for a while, I think the overall experience is actually a net negative. Yes, sometimes it gets things right and provides a correct solution, but often I can write much more concise code myself. Many times it provides code that looks correct, but after looking at it in more detail it's actually wrong. So now I need to be on guard about what code it inserts, which kills all the time it supposedly saved me. It makes things harder because the code does look like it might work.

It's like pair programming with a complete moron that is very good at picking up patterns and trying to use them in the code that follows. So if you do a lot of copy and paste, I think it will help.

I think this technology can make bad programmers suck less at programming. The problem with LLMs is that they were trained on existing work, and their goal is to convince a human that the result was created by another human; they aren't capable of doing any actual reasoning.

[–] TrickDacy@lemmy.world -2 points 11 months ago* (last edited 11 months ago) (4 children)

Wow, my experience has been pretty much the exact opposite of this. Copilot is amazing and I'd rather not go without it ever again

Edit: for the life of me I'll never understand people. This comment got a bunch of downvotes and yet some douchebag who blindly accuses me of being bad at my job gets upvoted. Fuck people.

[–] takeda@lemmy.world 9 points 11 months ago* (last edited 11 months ago) (1 children)

What language do you program in, and what kind of code do you develop? Before Copilot, were you frequently searching for answers on Stack Overflow?

[–] TrickDacy@lemmy.world 3 points 11 months ago

TypeScript, JavaScript, PHP, bash, SCSS/CSS... And isn't every dev on SO, or at least a search engine, with some frequency?

I don't actually think the reason I like it depends on the language at all. The reason I like it is that it will often basically notice what I'm doing and save me from typing a repetitive 3-5 line block. Beyond things like that, if I can't remember a specific syntax, I've found I can write a comment saying what the following code will do and boom, suddenly Copilot writes a version of that code close to what I would've written.
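For example (a totally made-up snippet, and Python just to keep it short, but this is the kind of thing I mean - I type the comment, Copilot proposes roughly the function below):

```python
# parse "KEY=VALUE" lines from a config string into a dict,
# skipping blank lines and # comments
def parse_config(text: str) -> dict[str, str]:
    config: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return config
```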

I mean, you're right that it can write stuff that doesn't work; I just find that I can usually filter that out pretty quickly. The times I can't, I'm a bit stuck anyway and it's worth a shot to try its mysterious solution. But since I always treat its solutions with skepticism, I haven't been bitten yet.

For me, Copilot just takes the monotony out of the job. Instead of spending as much time writing boring stuff, I get to focus on the more interesting parts.

[–] jackie_jormp_jomp@lemm.ee 5 points 11 months ago (1 children)

Maybe you aren't that good at writing code

[–] TrickDacy@lemmy.world 3 points 11 months ago

Maybe you aren't that good at being a human, this comment being good evidence of that

[–] interceder270@lemmy.world 3 points 11 months ago (1 children)

Ignore them. At some point you gotta realize most people are losers trying to bring others down with them.

Do what works for you :)

[–] TrickDacy@lemmy.world 1 points 11 months ago

I appreciate this comment. You inspire me to not only ignore more assholes, but maybe I'll also be one myself less often :)

[–] deur@feddit.nl 2 points 11 months ago (1 children)

I'll blindly accuse you of being bad at your job too, bud.

[–] TrickDacy@lemmy.world -4 points 11 months ago

Thanks for the block request. Appreciate reducing the number of douchebags in my life.