this post was submitted on 13 Sep 2023
350 points (96.8% liked)
Technology
Because photographs don't require other people's photographs to work. They just require the labour of the engineers at Nikon, and you paid them by buying the camera.
Use an AI algorithm with no training set and see how good your tool is.
What if I used an open-source algo with my own photographs as a dataset 🤔
I don't see why you wouldn't be able to keep copyright then. Everything involved would have been owned by you.
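To make the "trained only on your own data" idea concrete, here is a minimal sketch, not any real image model: a tiny linear autoencoder in NumPy, with random arrays standing in for your own photographs. The point is just that nothing but your own dataset ever enters the training loop; the architecture, sizes, and learning rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for "my own photographs": 200 flattened 8x8 grayscale images.
# (Synthetic data here; in practice you would load your own files.)
photos = rng.random((200, 64))

# Tiny linear autoencoder: compress 64 pixels to 8 latent dims and back.
W_enc = rng.normal(0, 0.1, (64, 8))
W_dec = rng.normal(0, 0.1, (8, 64))

def loss(X):
    """Mean squared reconstruction error."""
    return np.mean((X @ W_enc @ W_dec - X) ** 2)

lr = 0.01
initial = loss(photos)
for _ in range(500):
    Z = photos @ W_enc                      # encode
    err = Z @ W_dec - photos                # reconstruction error
    grad_dec = Z.T @ err / len(photos)      # gradient w.r.t. decoder
    grad_enc = photos.T @ (err @ W_dec.T) / len(photos)  # w.r.t. encoder
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

final = loss(photos)
print(final < initial)  # reconstruction improved using only "my" photos
```

The only "learning material" here is the `photos` array you supplied yourself, which is the scenario where the copyright question in this thread seems clearest.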
That is a big difference from how other generative models work, though, which do use other people's work.
Because you would have to prove that the AI learned only from your work, and it's my understanding that there is no way to track what was used as training material, or to have an AI unlearn something.
The people stealing art designed their algorithms not to contain proof that they stole it. If they are legally required to prove what training data they used in order to get a copyright, then they will design the AI around that. That would immediately disqualify most current AIs, because they have all been fed stolen art, but I am sure they have the tech and capital to start over. And you know, fuck 'em.