this post was submitted on 03 Oct 2023
77 points (100.0% liked)


Speaking as a creative who has also gotten paid for creative work, I'm a bit flustered at how brazenly people just wax poetic about the need for copyright law, especially when the creator or artist themselves are never really considered in the first place.

It's not like ye olde piracy, which can even be ethical (like video games going unpublished and being almost erased from history), but a new form whereby small companies get to join large publishers in screwing over the standalone creator - except this time it isn't by way of predatory contracts, but by sidestepping the creator and farming data from them to recreate the same style and form, which could've taken years, even decades, to develop.

There's also this idea that "all work is derivative anyways, nothing is original", but that sidesteps the point of having worked for decades to form a style and make a living off it, only for someone to come along and undo all that with the press of a button.

If you're libertarian and anarchist, be honest about that. Seems like there are a ton of tech bros who are libertarian and subversive about it to feel smort (the GPL is important btw). But at the end of the day the hidden agenda is clear: someone wants to benefit from somebody else's work without paying them, and to find the mental and emotional justification to do so. This is bad, because they then justify taking food out of somebody's mouth, which is par for the course in the current economic system.

It's just more proof in the pudding that the capitalist system doesn't work and will always screw the labourer in some way. It's quite possible that only the most famous of artists will be making money directly off their work in the future, similarly to musicians.

As an aside, Jay-Z and Taylor Swift complaining about not getting enough money from Spotify is tone-deaf, because they know they get the bulk of that money anyway, even the money from an account that only ever plays the same small bands, because of Spotify's payout model. So the big ones will always, always be more "legitimate" than small artists, and in that case they've probably already paid writers and such, but maybe not.. looking at you, Jay-Z.
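For anyone who hasn't looked at how Spotify-style pro-rata pools work, here's a rough sketch with made-up numbers (the real rates and contracts are different, this is just the shape of it): every subscription goes into one pot, and the pot is split by share of total streams, so the fan who only plays small bands still mostly funds the big star.

```python
# Rough sketch of a pro-rata streaming pool (numbers are invented).
subscribers = {
    "small_band_fan": {"Small Band": 100},   # only ever plays Small Band
    "casual_listener": {"Big Star": 900},
}
monthly_fee = 10.0
pot = monthly_fee * len(subscribers)         # one shared pot: 20.0

# Total streams per artist across the whole platform
totals = {}
for plays in subscribers.values():
    for artist, count in plays.items():
        totals[artist] = totals.get(artist, 0) + count

all_streams = sum(totals.values())           # 1000
payouts = {artist: pot * count / all_streams for artist, count in totals.items()}
print(payouts)  # {'Small Band': 2.0, 'Big Star': 18.0}
```

Under a user-centric model, that fan's full $10 would go to Small Band instead; under pro-rata, most of it ends up with Big Star.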

If the copyright cases are won by the litigious lot known as corporate lawyers, and they manage to carve loopholes into legislation that benefit both IP farmers and corporate interests, by way of models that train AI to be "far enough" away from the source material, we might see a lot of people lose their livelihoods.

Make it make sense, Beehaw =(

top 50 comments
[–] jarfil@beehaw.org 28 points 1 year ago (3 children)

It's just more proof in the pudding that the capitalist system doesn't work

I think that's the key part.

You seem to like making art. If you had all your living needs covered, without the need to sell any of your art... would you stop making it?

I think the AI is not the problem, the lack of (or sidestepping of) copyright is not the problem, and the mimicking of a style that took decades to perfect is also not the problem.

The real problem is that AI increases several-fold the underlying problems of believing in a predatory social system.

But if it helps you sleep at night, think about this: the AIs are not out here just for the artists, they're out here for all human thinking. In a short time, bankers and CEOs will be begging alongside artists, burger flippers, and car mechanics. If there's one thing the LLMs have proven, it's that there is no task an AI cannot replicate... and the perverse twist of capitalism is that there will be someone willing to use them for everything to cut costs, leaving essentially everyone without a job.

[–] SugarApplePie@beehaw.org 26 points 1 year ago (1 children)

Make it make sense, Beehaw =(

Unfortunately, AI is one of this community's blind spots, so you're probably outta luck on this one. If it's not someone shyly giving themselves a pass because their use case is totally ethical and unlike other people's, it's someone smugly laughing at people scared for their livelihoods as companies cut out more and more people to save a dollar here and there. The number of people that welcome factory-churned content slop will always outnumber those that still give a shit; best we can do is hope for some legislation that limits the more harmful aspects and learn to deal with the harm that can't be undone.

[–] jarfil@beehaw.org 9 points 1 year ago* (last edited 1 year ago) (1 children)

best we can do is hope for some legislation that limits the more harmful aspects and learn to deal with the harm that can't be undone.

That kind of legislation will come late, and won't change a thing.

Best we can do is realize the effects are only harmful if we insist on running faster and faster, trying to outcompete the AIs. Nobody can outrun an AI, definitely not the ones that will be running on hardware 5-10 years from now (expect memristor-based neural net accelerators that will leave current GPU-based solutions in the dust), and nobody will stop random people from using them for everything now that the box has been opened (just pray the first use won't be for war).

Fight for legislation that will stop requiring people to run the job rat maze to survive in the first place, to have a fighting chance; the alternative is a lot of suffering for everyone.

[–] SugarApplePie@beehaw.org 4 points 1 year ago* (last edited 1 year ago) (1 children)

Fight for legislation that will stop requiring people to run the job rat maze to survive in the first place, to have a fighting chance

Here, here. Or is it hear, hear? Either way I completely agree, though I very much doubt we'll see something like that in our lifetime. Still worth fighting for though!

[–] jarfil@beehaw.org 4 points 1 year ago* (last edited 1 year ago) (2 children)

This could be a start:

https://en.m.wikipedia.org/wiki/Universal_basic_income

It's an old idea, already successfully tested in some places (and in a few more thanks to COVID), and it just needs more general awareness and support... which I think the incoming AI transition might give it.

Would be nice to have it in place before it becomes widely needed, but we'll see how it goes.

[–] Shalakushka@kbin.social 19 points 1 year ago* (last edited 1 year ago)

Tech bros and capitalists are fundamentally uncreative and view all creativity as mish-mashing various obvious influences together in the pursuit of more money. So, when an opportunity comes along, in their eyes, to totally invalidate a set of people who have learned a skill they don't believe exists, and to save money in the process, they see it as a natural development. They don't respect labor or art and are happy to replace both with any number of machines.

[–] commie@lemmy.dbzer0.com 15 points 1 year ago (1 children)

copyright is an antiquated solution to a non-existent problem. it needs to be abolished. if you want to get paid for your work, find someone who will pay you to work.

I like the GPL as much as the next person but I like public domain even more.

[–] taanegl@beehaw.org 12 points 1 year ago* (last edited 1 year ago) (7 children)

Why is it antiquated when we live in a world where you need to pay rent? And why pay for work when you can just digitally copy the work?

What you say makes no sense. It could take you two decades to build up a piece or body of work, just to have that taken away in one fell swoop. What incentive does one then have to work in arts and entertainment?

Forget independent artists, because they will fade into the woodwork as everything of artistic merit becomes purely product - and that is not how the greatest works or bodies of work have been created, despite what some upper management types might tell you.

Now, if you also advocate for basic income, or perhaps even some way to monetize non-copyrighted work so I could pay rent... then I'm all ears.

But there's also the sneaking suspicion that most people just wanna farm AI art and sell it off at the expense of independent artists, the same way the stupidly commodified property market makes more renters than buyers, and that is a degenerate world driven by egoism we could clean up quite nicely with some well-placed nukes.. let the lizards take a stab at becoming higher reasoning beings instead.

Also, public domain has no requirement to contribute back. For that kind of thing we have the permissive MIT and BSD licenses, whereas the GPL is copyleft - it demands contribution back... which is also why Microsoft, Google and Apple hate the GPL... but yeah, public domain is also awesome - and the scope that AI farmers should stick to.

[–] commie@lemmy.dbzer0.com 4 points 1 year ago

Why is it antiquated when we live in a world where you need to pay rent?

the statute of anne had nothing to do with paying people's rent: it was to stop the printers in london from breaking each other's knees. that's not a real threat anymore, so, yea, it's totally antiquated.

people share stories, songs, recipes, and tools. legally preventing people from sharing is inhumane.

[–] SkepticElliptic@beehaw.org 3 points 1 year ago* (last edited 1 year ago)

Why is it antiquated when we live in a world where you need to pay rent? And why pay for work when you can just digitally copy the work?

You wouldn't download a house.

[–] ConsciousCode@beehaw.org 14 points 1 year ago* (last edited 1 year ago)

For my two cents, though this is a bit off topic: AI doesn't create art, it creates media, which is why corpos love it so much. Art, as I'm defining it now, is "media created with the purpose to communicate a potentially ineffable idea to others". Current AI has no personhood, and in particular has no intentionality, so it's fundamentally incapable of creating art, in the same way a hand-painted painting is inherently different from a factory-painted painting. It's not so much that the factory painting is inherently of lower quality or lesser value, but there's a kind of "non-fungible" quality to "genuine" art which isn't a simple reproduction.

Artists in a capitalist society make their living off of producing media on behalf of corporations, who only care about the media. Since it's humans creating the media, it's basically automatically art. What I see as the real problem people are grappling with is that people's right to survive is directly tied to their economic utility. If basic amenities were universal and work was something you did for extra compensation (as a simple alternative example), no one would care that AI can now produce "art" (ie media), any more than chess stopped being a sport when Deep Blue was built, because art would be something they created out of passion, with compensation not tied to survival. In an ideal world, artistic pursuits would be subsidized somehow so even an artist who can't find a buyer can be compensated for their contribution to Culture.

But I recognize we don't live in an ideal world, and "it's easier to imagine the end of the world than the end of capitalism". I'm not really sure what solutions we end up with (because there will be more than one), but I think broadening copyright law is the worst possible timeline. Copyright in large part doesn't protect artists, but rather large corporations who own the fruits of other people's labor and can afford to sue over their copyright. I see copyright, patent, and to some extent trademarks as legally-sanctioned monopolies over information which fundamentally halt cultural progress and have had profoundly harmful effects on our society as-is. It made sense when it was created, but became a liability with the advent of the internet.

As an example of how corpos would abuse extended copyright: Disney sues stable diffusion models with any trace of copyrighted material into oblivion, then creates their own much more powerful model using the hundred years of art they have exclusive rights to in their vaults. Artists are now out of work because Disney doesn't need them anymore, and they're the only ones legally allowed to use this incredibly powerful technology. Any attempt to make a competing model is shut down because someone claims there's copyrighted material in their training corpus - it doesn't even matter if there is, the threat of lawsuit can shut down the project before it starts.

[–] luciole@beehaw.org 12 points 1 year ago (1 children)

I work at a small non-profit publisher and our clients respecting copyright is basically what decides if we continue existing or not. I struggle as well with the general "end all copyright" sentiment. There’s this idea that circumventing copyright means sticking it to corporations, as if their creative employees making a living don’t exist.

Furthermore, I feel that generative AI is just the latest tech bro venture based on siphoning revenues out from under existing businesses whilst escaping the laws that apply to the sector. Advertisement revenues were siphoned from under the press, hotels are facing competition from businesses subverting residential housing, restaurants are being charged exorbitant prices to get their goods delivered. The ambient cynicism serves to maintain indifference towards these unethical tactics.

[–] chicken@lemmy.dbzer0.com 9 points 1 year ago

But at the end of the day the hidden agenda is clear: someone wants to benefit from somebody else's work without paying them

Yes, our whole civilization would be so much richer overall if everything could be shared and everyone could benefit from the creative and intellectual work of everyone else. Artificial scarcity and copyright are an awful kludge to make this kind of work sort-of-compatible with our awful economic system, and they come at the expense of everyone.

It’s just more proof in the pudding that the capitalist system doesn’t work and will always screw the labourer in some way. It’s quite possible that only the most famous of artists will be making money directly off their work in the future, similarly to musicians.

If it doesn't work, then why try to maintain the status quo? The future you seem to be worried about will not be stopped by more restrictive rules on training data, because the big companies outright own enough media to meet that requirement anyway. And then no one else can, and their monopoly over these fantastically powerful tools, which no one can compete with, becomes much stronger. Creative workers demanding that AI be reined in by copyright seems incredibly naive to me.

[–] MossyFeathers@pawb.social 9 points 1 year ago (1 children)

Here's my view: I like games, I want to make games. Not only do I want to make games, there are games I want to make which would require a massive team of people to accomplish. That's not cheap, and I don't have, nor will I likely ever have, the money to make them.

If I take it to a studio and say, "here's this game I want to make, here's a prototype showing how it'll play, the basic mechanics, here are some sketches showing the general art style" and so forth, then if they decide they like it (which is a huge if), my understanding is that they typically expect to receive ownership of the copyright for the game and all associated IPs. That means the game is no longer my game, it's now owned by the company. If I want to take that game to another company because I'm not happy with how the current company is handling it, well, that's too bad, it's not my game anymore. I don't even own the characters, the name, none of the stuff I originally pitched is mine anymore, it's now owned by the company.

AI, on the other hand, promises to eventually allow me to be able to generate models, animations, textures, and so on. This massively decreases the budget and staffing required to make the game a reality, potentially bringing the costs in line with something I can actually afford. The artists weren't replaced by AI because I couldn't afford to pay them in the first place. That's not a slight against them, I'd pay them up front if I could, but I can't; nor do I believe it's ethical or moral to string them along with the promise of profit sharing when I know full well that I'm not really interested in making a profit. I'm ultimately doing it because I want to and if I make money at it, then that's cool. If I promise to share any profit the game makes, there's a real potential that they might get pennies when they could have been making more money working for someone else. At that point I've selfishly taken food out of their mouths and wasted their time.

Being able to use AI to assist in game creation also means that while any AI-generated assets are public domain, I still get to keep whatever I made by hand, whether it's the script, the hero models, or even just the setting and character designs. I also get full oversight of what I'm making; I don't have to worry about business suits harassing me about whether or not my game is going to be profitable, or how marketing analysis says I need to add X mechanic, focus on having Y graphics, or include Z representation. It's my artistic vision, and while I may have used AI to assist in bringing it to fruition, they're simply pieces of a larger, human-created work.

Or I guess to put it another way, I understand why artists are upset by AI generating traditional artworks; however, AI also has the future potential to reduce the barrier to entry for complex creative works to the point where even a highly complex game or AAA-quality movie could be done by a small group of friends, or even a single person. If you have the money, then you should absolutely pay your artists, but I also think it should be decided on a case-by-case basis.

Instead of painting it all with a broad brush, take into consideration whether or not it'd be realistically feasible for an individual or creative group to do it "right". How much was AI-generated? A little? A lot? All of it? How much is okay? Does it matter if the individual parts are generated by an AI if it was ultimately assembled and guided by a human? What situations would it be okay to use AI in? Is your view reasonable? Why or why not? Consider it not just from your perspective, but from the perspective of the person wanting to create their vision. Not all creative works are equal when it comes to the effort required to create them. Hell, not all games are equal in that regard. It's significantly easier to make a simple platformer or RPG than it is to create a Fallout or GTA.

I'm not gonna pretend I have the answers, I recognize how much damage AI can do to creative industries; however I also recognize that there's a lot of creativity going to waste because the barriers are so high for some types of creative works that AI is likely the only way they'll ever have the chance to see the light of day.

[–] frog@beehaw.org 5 points 1 year ago (1 children)

Your creative vision doesn't entitle you to profit from others' hard work just because you don't want to put in the work to learn those skills yourself.

[–] MossyFeathers@pawb.social 3 points 1 year ago (1 children)

I imagine I'm about to talk to a brick wall, because I see that message nearly word-for-word whenever AI ethics comes up. But to hell with it. I'm already miserable; it's not like talking to a stubborn brick wall is going to make me any more miserable than I already am.

That's the problem and I get the sense you didn't read my message. I know how to 3d model. I know how to make textures, how to animate, how to write, how to make sound effects. I literally know how to do nearly every part of the development process. I'm telling you that this isn't a case of not wanting to learn the skills. This is a case of game development being so ridiculously complex that the feasibility of a single person being able to create a game ranges from "easily possible" to "that's literally impossible, you'd never make it a reality even with every developer in the world working on it".

You're coming into this looking at it like every creative pursuit is the same as traditional art. You plop a skilled person down in front of a canvas and they can make a beautiful artwork all by themselves. However, the same is not true for games. I have most of the skills necessary to make a game from scratch, and I'm telling you that this has nothing to do with being unwilling to learn new skills; this is entirely about the fact that games are so ridiculously complex that it doesn't matter what your skill set is. As it stands right now, some games are so complex they can only be built as a capitalist pursuit, not as a creative one.

[–] frog@beehaw.org 3 points 1 year ago

Making a game is a team effort, I am aware of this. I'm literally a game design student. But your excuse that it's okay for you to use AI because you want to make a game alone doesn't hold much water. Other people are part of the process: all those artists whose work was strip-mined for AIs. You're basically going to profit from their work without having any responsibility to pay them for all their effort, or even the decency to get their permission, as you would for any other asset you want to use. The fact that their work is "copyright laundered" through an AI first doesn't change what you're doing, no matter how much you try to convince yourself it's okay.

[–] DavidGarcia@feddit.nl 9 points 1 year ago (8 children)

What do you think should be the alternative then?

The way I see it, you could:
  1. Not have any models at all, which I think is shortsighted.
  2. Hand over exclusive control over these models to big tech companies that have the money to pay these artists.
  3. Make creative commons models that will probably never be able to compete with the big tech models.
  4. Perhaps ban anything except creative commons models for personal use?

I'd much rather AI models were freely available to everyone equally. The best compromise I could see is developing some legally binding metric that determines whether the output you want to use commercially is similar enough to some artist's work, so you have to reimburse them.
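Purely as a sketch of what such a metric could look like (the embedding step and the 0.9 threshold here are made up by me, nothing like this exists in law or in any product): embed the output and an artist's catalogue, and trigger reimbursement when the similarity crosses a line.

```python
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def owes_reimbursement(output_vec, artist_vecs, threshold=0.9):
    """Hypothetical rule: if a commercially used output is 'too close' to
    anything in an artist's catalogue, a fee is owed. The embeddings and
    the threshold are placeholders, not an existing standard."""
    return any(cosine(output_vec, v) >= threshold for v in artist_vecs)

# Toy vectors standing in for real image embeddings
catalogue = [np.array([1.0, 0.0, 0.2]), np.array([0.8, 0.1, 0.3])]
print(owes_reimbursement(np.array([0.9, 0.05, 0.25]), catalogue))  # True
```

The hard part, of course, is agreeing on the embedding and the threshold - which is exactly the legally binding bit.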

[–] taanegl@beehaw.org 10 points 1 year ago* (last edited 1 year ago) (1 children)

Can't put the genie back in the bottle, I guess =\ Seems the only really protected form is modern art, because nobody understands that anyways ^^;

I'm thinking the problem of AI has to be solved by AI, that those decades need to be replaced with AI training - like you said, having it generally available.

But that too leaves an outlier, people who don't want to work with AI. Their only option is to never digitally publish and make all their work bounce light so that cameras can't capture it. It'd be physical DRM in a sense.

I don't really want to work with AI, because it takes away the process I love, but in the end we're sort of forced to =\ It's like the industrial and digital revolutions all over again. Some people (like me) will be dragged kicking and screaming into the future.

[–] DavidGarcia@feddit.nl 3 points 1 year ago

I think there will always be a market for real physical artists. Yeah, you can buy boxed wine, but people pay to get the real artisanal stuff. Pretty sure real art will become a similarly highly sought-after luxury product. If you really like the process and keep at it, you probably won't have that much competition, because there will be fewer and fewer people with that skill set. There's mass-manufactured Ikea furniture, but people still buy handmade tables for ridiculous prices.

And who knows, maybe AI will grow on you too.

Or you'll be highly sought after once we finally inevitably ban AI lol.

So the future isn't all doom and gloom, if you ask me.

[–] taanegl@beehaw.org 5 points 1 year ago (1 children)

But yeah, I'm no lawyer. I have no idea how to legally solve this problem, but I suspect that eventually no law can solve it, once the works become so good at distinguishing themselves, dropping the price so low that doing the real work manually becomes a non-starter, or a hobby more or less.

Humans were meant to work with their hands and minds :( now it's all keyboards and screens. I tried to get away from that, but they all just keep pulling me back in!

We're being enslaved by our computers :(

[–] Omega_Haxors@lemmy.ml 8 points 1 year ago* (last edited 1 year ago) (3 children)

piracy, which can even be ethical

Nah, it flat out is always ethical. Nintendo and Steam are proof that everyone benefits from piracy. Yes, even the companies.

"But they're stealing my intelectu-" 1) Nobody's stealing shit, it's still there and 2) If your business model involves depriving people who want your product of your product, that's very much a you problem. Don't screw everyone else over to chase dollars that were never there.

"But then the people who made it aren't getting pa-" Newsflash, the Hollywood strike is proof that you aren't fucking paying them anyway.

"You're just defending your shitty behavior because you download stuff for free" Literally never pirated a thing in my life, and probably never will. If I want something I buy it. If I can't buy it, I lose interest and move on to something else.

[–] Fizz@lemmy.nz 11 points 1 year ago* (last edited 1 year ago) (3 children)

People's want for a product is irrelevant. Just because you want a game doesn't make it ethical to download it illegally. You can say you're only doing it because Nintendo doesn't provide a way for you to play it, but they might have plans to re-release the game, and after pirating it you may no longer buy that release.

I pirate games and media and I think it's unethical. I just don't want to be ethical to cunty corporations like Nintendo.

[–] luciole@beehaw.org 8 points 1 year ago (1 children)

You’re not being ethical to the artists and devs that work at Nintendo though. You’re pissed at the leadership but it’s never them who gets sacked when the shit hits the fan.

[–] Omega_Haxors@lemmy.ml 5 points 1 year ago* (last edited 1 year ago)

I pirate games and media and I think it's unethical. I just don't want to be ethical to cunty corporations like Nintendo.

Gigachad.

[–] jarfil@beehaw.org 4 points 1 year ago

Just because you want a game doesn't make it ethical to download it ~~illegally~~

Not all laws are ethical, so I'll skip that part.

What makes "pirating" content ethical, are basically two cases:

  1. You don't have the money to pay for it. Whether you download it or not, the producer will get paid exactly the same.
  2. The producer is re-releasing the same content you already paid for over and over, asking full price for it every time, while blocking you from using your already paid-for version.
[–] memfree@beehaw.org 6 points 1 year ago (1 children)

I can't see how pirating could be ethical in the case where you create the story / paint the picture / write the song / build the machine, and then Disney/Time Warner/Sony/Amazon pirates it and sells it for profit while you get nothing.

[–] Omega_Haxors@lemmy.ml 3 points 1 year ago (1 children)

If they genuinely did that it would open up the floodgates and capitalism would collapse in less than a month.

[–] frog@beehaw.org 3 points 1 year ago

It already happens, though. Big companies do it to the little guys all the time, they just get away with it by throwing money at the courts until the little guy suing them runs out of money and/or dies. (Literally in some cases.) It would only cause capitalism to collapse if a big company did it to another big company: say Amazon started pirating Disney's stuff.

[–] taanegl@beehaw.org 5 points 1 year ago (2 children)

All of that is true, but I mentioned all that - especially the predatory contracts. Now the general public seems to be missing the gist of copyright as well, probably since the likes of Disney have gamed copyright law.

But why should all that affect independent artists, who barely make enough as it is? Generative AI bases its learning models not on classical works or anything in the public domain, but directly on the work of modern artists who spent maybe a decade or two honing skills, finding stylistic angles and breathing new life into old formats, only for someone to swallow it all up and make a few tweaks, with a small payment going to some data centres? =\

How does that make sense?

[–] Omega_Haxors@lemmy.ml 9 points 1 year ago (1 children)

AI art is genuine theft because you're taking works, not crediting the originals, and then claiming them as your own.

[–] taanegl@beehaw.org 4 points 1 year ago

Unfortunately, that is true - and also what everyone ignores, because AI got them pumped for a bright future of working less, not knowing that people who spent the past decade or so honing their craft can just throw all that in the trash.

It's a damned if you do, damned if you don't situation. Everyone has to use AI now =\ whether you like it or not... or come up with a format that just messes with AI and lets humans understand it due to some flaw in our senses that the AI can't make out... though that seems very unlikely.

Maybe a new anti-AI codec of some sort that prevents AI from interpreting audio, images or video... but that would be a stopgap solution, as it would almost certainly be circumvented.. unless the codec comes with a new strict license that promises to sue the ever loving crap out of tech companies who facilitate travel, distribution and recreation of works in said codecs.

[–] blazera@kbin.social 8 points 1 year ago (1 children)

Copyright only exists so rich people can own even more things through money alone, without having to do any of the work themselves.

[–] collegefurtrader@discuss.tchncs.de 3 points 1 year ago (1 children)
[–] notfromhere@lemmy.one 5 points 1 year ago (1 children)

Case in point, the conversation we are having. Corporations ignore copyright when it’s in their favor. I see stories all the time about some huge corporation ripping off the work of an individual artist.

[–] gamermanh@lemmy.dbzer0.com 7 points 1 year ago (1 children)

Ah, people being scared of new technology in their field is a funny thing to watch in real time

[–] jarfil@beehaw.org 25 points 1 year ago (1 children)

Don't make fun of people being scared. Some have invested decades into honing skills that are becoming obsolete, have some empathy.

[–] gamermanh@lemmy.dbzer0.com 3 points 1 year ago (1 children)

The skills will not be obsolete, I guarantee there will be a market for people to still do all of the drawing/digital art/whatever they do

There will also be AI tools that they will likely need to learn or they will be left behind by the majority, sure, but that's what happens when a new tool shakes up your industry

Also, I never made fun of anyone or lacked empathy; I said it was funny to watch in real time as an industry shifts to new technology, so chill

[–] lemillionsocks@beehaw.org 6 points 1 year ago

The thing that gets me is people trying to rationalize it as "well, don't people learn from and get influenced by referencing other people's work?" and it's like, yeah, but a person can't do that as quickly as an AI can, and that individual can't then go on to work for thousands or millions of people at once. Also, it's so transparently clear that once this tech matures it will be used by major companies and employers that hire creatives as a way to not have to pay actual artists. The savings then get passed on to executives up top.

I feel like I don't have an issue with AI being able to create cool stuff, but if you want to make art "free" and for the "masses", then you can't make money off of it. Full stop.

[–] doricub@beehaw.org 6 points 1 year ago

The way I see it, the main problem is actually the training databases. If these companies have gathered a giant database for use in training and not paid the people who created the training material, then they are engaging in piracy. It seems like they should be paying royalties to whoever owns the rights to their training material for each use of the AI.

[–] sculd@beehaw.org 5 points 1 year ago (1 children)
  1. AI is trained on years, even centuries of work made by generations of people.

  2. AI then threatens to replace hundreds of thousands of jobs, to the benefit of huge corporations who could afford to deploy AI.

  3. AI cannot entirely replace human input at the current stage, but it definitely replaces entry-level jobs, leaving little room for new graduates to grow.

  4. Since AI will not get tired and will not complain, major corporations really like them (See Hollywood executives).

  5. We must ACT NOW. (Like the writers guild in the US.)

This is speaking from a writer's perspective, your mileage may vary. I used to ask my younger colleague to help with first drafts. Now it may be faster to just use ChatGPT. So how could they grow to become an editor?

[–] ConsciousCode@beehaw.org 5 points 1 year ago

SAG-AFTRA was very smart to make AI writing a wedge issue. The technology isn't quite there yet, but it will be very soon and by that point it would've been too late to assert their rights.

[–] Thelsim@beehaw.org 4 points 1 year ago (1 children)

So.. first things first. I'm a happy Midjourney user and post quite a bit of stuff over at one of the other Lemmy communities (same name, different account). But I only use the AI for fun and never for profit. I can give tons of justifications, but in the end it comes down to this: I'm a crappy artist and I have a vivid imagination. AI gives me an outlet to visualize the things in my head, and the rush of seeing them for real is really nice.

That being said. One of the things I don't do, is write prompts like "in the style of ....". Specifically because I don't want it to be a copy of someone's work, even if it is for personal use. It feels (and obviously is) wrong.
Maybe not a perfect solution, but they should remove all the artist names (those alive or less than 50(?) years dead) from the current models. If your name isn't in it, then it'll be a lot harder to recreate your style.
In the longer run, a register of what prompt and which model were used for AI-generated images might help with copyright claims? The EU is already busy with legislation for registering AI models. This might be a logical follow-up?
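Just to sketch what one entry in such a register could look like (the field names are made up by me; as far as I know the EU rules don't prescribe any format):

```python
from dataclasses import dataclass, asdict
import hashlib, json

@dataclass
class GenerationRecord:
    """One entry in a hypothetical provenance register for AI images."""
    image_sha256: str   # hash of the generated file
    model_name: str     # e.g. "midjourney" (illustrative value)
    model_version: str
    prompt: str
    created_at: str     # ISO 8601 timestamp

def record_for(image_bytes: bytes, model_name: str, model_version: str,
               prompt: str, created_at: str) -> str:
    rec = GenerationRecord(
        image_sha256=hashlib.sha256(image_bytes).hexdigest(),
        model_name=model_name,
        model_version=model_version,
        prompt=prompt,
        created_at=created_at,
    )
    return json.dumps(asdict(rec))
```

Something like that attached to every published generation would at least make "what was this made from?" an answerable question.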

I'm just throwing out ideas at this point. I'm not an expert in any of these fields (AI, legal, copyright, etc.) All I know is that it would definitely be a net loss for society if small artists are no longer able to make a living practicing their profession.

[–] jarfil@beehaw.org 3 points 1 year ago (2 children)

If your name isn't in it, then it'll be a lot harder to recreate your style.

Harder, but not impossible. There are already prompt dictionaries out there, and if you check some mobile apps that offer AI art generation, you can see how they offer "styles" that clearly append some behind-the-scenes settings to the prompt. Some also carry prompt dictionaries directly.
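In rough Python, that kind of "style" button boils down to nothing more than hidden text glued onto the user's prompt (the preset strings here are invented examples):

```python
# What an app's "style" picker amounts to: a hidden suffix appended to the prompt.
STYLE_PRESETS = {
    "oil painting": "oil on canvas, thick brushstrokes, muted palette",
    "anime": "cel shaded, clean line art, vibrant colors",
}

def build_prompt(user_prompt: str, style: str) -> str:
    return f"{user_prompt}, {STYLE_PRESETS[style]}"

print(build_prompt("a fox in a forest", "oil painting"))
# a fox in a forest, oil on canvas, thick brushstrokes, muted palette
```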

Midjourney is also just a centralized version of Stable Diffusion; you can run the software on your own, with whatever LoRA modifiers you want, including one "in the style of [...]".

[–] A1kmm@lemmy.amxl.com 3 points 1 year ago

If AI generated art is a close derivative of another work, then copyright already applies.

But when it comes to vague abstractions over multiple works that aren't like any one of them, copyright is probably not the right fix for what is fundamentally a more general problem. Copyright has never covered that sort of thing, so you would be asking for an unprecedented expansion of copyright, and that would have immense negative consequences that would do more harm than good.

There are two ways I could see in which copyright could be extended (both of which are a bad idea, as I'll explain).

Option 1 would be to take a 'colour of bits' approach (borrowing the terminology from https://ansuz.sooke.bc.ca/entry/23). The analogy of 'bits' having a colour, and not just being a 0 or 1, has been used to explain how to be conservative about ensuring something couldn't possibly be a copyright violation - if a bit that is coloured with copyright is used to compute another bit in any way (even through combination with an untainted bit), then that bit is itself coloured with copyright. The colour of bits is not how copyright law currently works, but it is a deliberately over-conservative heuristic for avoiding copyright violation. Theoretically the laws around copyright and computing could change to make the colour of bits approach the law. This approach, taken strictly, would mean that virtually all the commercial LLMs and Stable Diffusion models are coloured with the copyrights of all inputs that went into them, and any output from the models would be similarly coloured (and hence in practice be impossible to use legally).
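As a toy illustration of that propagation rule (this is just the heuristic from the linked essay sketched in code, not how any real model or any law works): any value computed from a 'coloured' value carries every colour of its inputs, whatever operation was used.

```python
# Toy model of the "colour of bits" rule: anything computed from a
# copyright-coloured value is itself coloured, no matter how it was combined.
class Coloured:
    def __init__(self, value, colours=frozenset()):
        self.value = value
        self.colours = frozenset(colours)  # copyright owners "staining" this value

    def combine(self, other, op):
        """Combine two values with any operation; the colours always union."""
        return Coloured(op(self.value, other.value), self.colours | other.colours)

untainted = Coloured(0b1010)
tainted = Coloured(0b0110, {"Author A"})

result = untainted.combine(tainted, lambda a, b: a ^ b)
print(bin(result.value), result.colours)  # 0b1100 frozenset({'Author A'})
```

Taken strictly, every weight in a model trained on copyrighted inputs ends up carrying every one of those colours, which is what the rest of this comment pushes back on.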

There are two major problems with this: firstly, AI models are essentially a rudimentary simulation of human thinking (neural networks are in fact inspired by animal neurons). Applying the same rule to humans would mean that if you've ever read a copyrighted book, everything you ever say, write, draw or otherwise create after that is copyright to the author of that book. Applying a different rule to computers than to humans would mean essentially ruling out ever automating many things that humans can do - it seems like an anti-tech agenda. Limiting technology solely for the benefit of some people now seems short-sighted. Remember, once people made their livelihoods in the industry of cutting ice from the arctic and distributing it on ships for people to keep their food cold. They made their livelihoods lighting gas lamps around cities at dusk and extinguishing them at dawn. Society could have banned compressors in refrigerators and electric lighting to preserve those livelihoods, but instead, society advanced, everyone's lives got better, and people found new livelihoods. So a colour of bits approach either applies to humans, and becomes an unworkable mess where every author you've ever read basically owns all your work, or it amounts to banning automation in cases where humans can legally do something.

The second problem with the colour of bits approach is that it would undermine a lot of things that we have already been doing for decades. Classifiers, for example, are often trained on copyrighted inputs, and make decisions about what category something is in. For example, most email clients let you flag a message as spam, and use that to decide if a future message is spam. A colour of bits approach would mean the model that decides whether or not a message is spam is copyright to whoever wrote the spam - and even the Yes/No decision is also copyright to them, and you'd need their permission to rely on it. Similarly for models that detect abuse or child pornography or terrorist material on many sites that accept user-generated content. Many more models that are incredibly important to day-to-day life would likely be impacted in the same way - so it would be incredibly disruptive to tech and life as we know it.

Another approach to extending copyright, also ill-advised, would be to extend copyright to protect more general elements like 'style', so that styles can be copyrighted even if another image doesn't look the same. If this was broadened a long way, it would probably just lead to constant battles between artists (or more likely, studios trying to shut down artists), and it is quite likely that no artist could ever publish anything without a high risk of being sued.

So copyright is probably not a viable solution here - what is? As we move to a 'post-scarcity' economy, with things automated to the extent that we don't need that many humans working to produce an adequate quality of life for everyone, the best solution is a Universal Basic Income (UBI). Everyone who is making something in the future and generating profits is almost certainly using work from me, you, and nearly every person alive today (or their ancestors) to do so. But rather than some insanely complex and unworkable computation about who contributed the most, just tax all profit to cover it, and pay a basic income to everyone. Then artists (and everyone else) can focus on meaning and not profit, knowing they will still get paid the UBI no matter what, and contribute back to the commons, and copyright as a concept can be essentially retired.
