this post was submitted on 03 Oct 2023
21 points (100.0% liked)


Speaking as a creative who has also been paid for creative work, I'm a bit flustered at how brazenly people just wax poetic about the need for copyright law, especially when the creator or artist themselves is never really considered in the first place.

It's not like ye olde piracy, which can even be ethical (like with video games that are no longer published and almost erased from history), but a new form whereby small companies get to join large publishers in screwing over the standalone creator - except this time it isn't by way of predatory contracts, but by sidestepping the creator and farming their work to recreate the same style and form, which could have taken years, even decades, to develop.

There's also this idea that "all work is derivative anyway, nothing is original", but that sidesteps the point of having worked over decades to form a style and making a living off it, only for someone to come along and undo all of that with the press of a button.

If you're libertarian and anarchist, be honest about that. Seems like there are a ton of tech bros who are libertarian and subversive about it just to feel smort (the GPL is important, btw). But at the end of the day the hidden agenda is clear: someone wants to benefit from somebody else's work without paying them, and to find the mental and emotional justification for doing so. This is bad, because they then justify taking food out of somebody's mouth, which is par for the course in the current economic system.

It's just more proof in the pudding that the capitalist system doesn't work and will always screw the labourer in some way. It's quite possible that only the most famous of artists will be making money directly off their work in the future, similarly to musicians.

As an aside, Jay-Z and Taylor Swift complaining about not getting enough money from Spotify is tone-deaf, because they know they get the bulk of that money anyway - even the money from accounts that only ever play the same small bands - because of Spotify's payout model. So the big names will always, always be more "legitimate" than small artists, and in their case they've probably already paid writers and such. But maybe not... looking at you, Jay-Z.

If the copyright cases get overturned by that litigious lot known as corporate lawyers, and they manage to carve holes into legislation that benefit both IP farmers and corporate interests - by way of models that train AI to be "far enough" away from the source material - we might see a lot of people lose their livelihoods.

Make it make sense, Beehaw =(

top 45 comments
[–] [email protected] 14 points 11 months ago (1 children)

copyright is an antiquated solution to a non-existent problem. it needs to be abolished. if you want to get paid for your work find someone who will pay you to work.

I like the GPL as much as the next person but I like public domain even more.

[–] [email protected] 12 points 11 months ago* (last edited 11 months ago) (7 children)

Why is it antiquated when we live in a world where you need to pay rent? And why pay for work when you can just digitally copy the work?

What you say makes no sense. It could take you two decades to build up a piece or body of work, just to have that taken away in one fell swoop. What incentive does one then have to work in arts and entertainment?

Forget independent artists, because they will fade into the woodwork as everything of artistic merit suddenly becomes purely product - and that is not how the greatest works or bodies of work have been created, despite what some upper-management types might tell you.

Now if you also advocate for basic income, or perhaps even some way to monetize non-copyrighted work so I could pay rent... then I'm all ears.

But there's also the sneaking suspicion that most people just wanna farm AI art and sell it off at the expense of independent artists, the way the stupidly commodified property market makes more renters than buyers. That is a degenerate world driven by egoism that we could clean up quite nicely with some well-placed nukes... let the lizards take a stab at becoming higher-reasoning beings instead.

Also, public domain has no requirement to contribute back. Neither do the MIT and BSD licenses, the permissive "copyright" licenses, whereas the GPL is copyleft - it demands contribution back... which is also why Microsoft, Google and Apple hate the GPL. But yeah, public domain is also awesome - and the scope that AI farmers should stick to.

[–] [email protected] 4 points 11 months ago

Why is it antiquated when we live in a world where you need to pay rent?

the statute of anne had nothing to do with paying people's rent: it was to stop the printers in london from breaking each other's knees. that's not a real threat any more, so yea, it's totally antiquated.

people share stories, songs, recipes, and tools. legally preventing people from sharing is inhumane.

[–] [email protected] 3 points 11 months ago

or perhaps even some way to monetize non-copyrighted work so I could pay rent

many artists across the centuries have found success in the patronage model.

[–] [email protected] 3 points 11 months ago

Now if you also advocate for basic income, or perhaps even some way to monetize non-copyrighted work so I could pay rent… then I'm all ears.

i advocate for the abolition of private property. you should not be paying rent.

[–] [email protected] 3 points 11 months ago* (last edited 11 months ago)

Why is it antiquated when we live in a world where you need to pay rent? And why pay for work when you can just digitally copy the work?

You wouldn't download a house.

[–] [email protected] 3 points 11 months ago

What incentive does one then have to work in arts and entertainment?

i forgot no one made artistic works or put on performances before copyright.

oh, wait, the whole reason the statute of anne was written was because people were buying shakespeare's plays from various publishers.

[–] [email protected] 2 points 11 months ago

Microsoft, Google and Apple hate the GPL

you can't think they'd like my vision of the world any more than they like the gpl.

[–] [email protected] 2 points 11 months ago

public domain has no requirement to contribute back

i'm aware of that. i hope for a world where the gpl is unenforceable because copyright is put where it belongs: in the wastepaper basket of history.

[–] [email protected] 13 points 11 months ago* (last edited 11 months ago)

For my two cents, though this is a bit off topic: AI doesn't create art, it creates media, which is why corpos love it so much. Art, as I'm defining it here, is "media created with the purpose of communicating a potentially ineffable idea to others". Current AI has no personhood, and in particular no intentionality, so it's fundamentally incapable of creating art, in the same way a hand-painted painting is inherently different from a factory-painted painting. It's not so much that the factory painting is inherently of lower quality or lesser value, but there's a kind of "non-fungible" quality to "genuine" art that a simple reproduction doesn't have.

Artists in a capitalist society make their living by producing media on behalf of corporations, who only care about the media. When humans create media, it's basically automatically art. What I see as the real problem people are grappling with is that people's right to survive is directly tied to their economic utility. If basic amenities were universal and work were something you did for extra compensation (as one simple alternative), no one would care that AI can now produce "art" (i.e. media), any more than chess stopped being a sport when Deep Blue was built, because art would be something people created out of passion, with compensation not tied to survival. In an ideal world, artistic pursuits would be subsidized somehow, so that even an artist who can't find a buyer could be compensated for their contribution to Culture.

But I recognize we don't live in an ideal world, and "it's easier to imagine the end of the world than the end of capitalism". I'm not really sure which solutions we'll end up with (because there will be more than one), but I think broadening copyright law is the worst possible timeline. Copyright in large part doesn't protect artists, but rather the large corporations that own the fruits of other people's labor and can afford to sue over their copyrights. I see copyright, patents, and to some extent trademarks as legally sanctioned monopolies over information that fundamentally halt cultural progress and have had profoundly harmful effects on our society as-is. It made sense when it was created, but became a liability with the advent of the internet.

As an example of how corpos would abuse extended copyright: Disney sues stable diffusion models with any trace of copyrighted material into oblivion, then creates their own much more powerful model using the hundred years of art they have exclusive rights to in their vaults. Artists are now out of work because Disney doesn't need them anymore, and they're the only ones legally allowed to use this incredibly powerful technology. Any attempt to make a competing model is shut down because someone claims there's copyrighted material in their training corpus - it doesn't even matter if there is, the threat of lawsuit can shut down the project before it starts.

[–] [email protected] 5 points 11 months ago

But at the end of the day the hidden agenda is clear: someone wants to benefit from somebody else's work without paying them

Yes, our whole civilization would be so much richer overall if everything could be shared and everyone could benefit from the creative and intellectual work of everyone else. Artificial scarcity and copyright are an awful kludge to make this kind of work sort-of compatible with our awful economic system, and they come at everyone's expense.

It’s just more proof in the pudding that the capitalist system doesn’t work and will always screw the labourer in some way. It’s quite possible that only the most famous of artists will be making money directly off their work in the future, similarly to musicians.

If it doesn't work, then why try to maintain the status quo? The future you seem to be worried about will not be stopped by more restrictive rules on training data, because the big companies already outright own enough media to meet that requirement anyway. Then no one else can train a model, and their monopoly over these fantastically powerful tools that no one can compete with only gets stronger. Creative workers demanding that AI be reined in by copyright seems incredibly naive to me.

[–] [email protected] 5 points 11 months ago (2 children)

It's just more proof in the pudding that the capitalist system doesn't work

I think that's the key part.

You seem to like making art. If you had all your living needs covered, without the need to sell any of your art... would you stop making it?

I think AI is not the problem, the lack of (or sidestepping of) copyright is not the problem, and the mimicking of a style that took decades to perfect is also not the problem.

The real problem is that AI amplifies several-fold the underlying problems of our belief in a predatory social system.

But if it helps you sleep at night, think about this: the AIs are not out here just for the artists, they're out here for all human thinking. Before long, bankers and CEOs will be begging alongside artists, burger flippers, and car mechanics. If there's one thing the LLMs have proven, it's that there is no task an AI cannot replicate... and the perverse twist of capitalism is that there will be someone willing to use them for everything to cut costs, leaving essentially everyone without a job.

[–] [email protected] 2 points 11 months ago

This is right! There's a large group of artists making a living not by making things that call for creative thought and artistic vision, but by doing soul-sucking work for the sake of profitability. Think promotional flyer design, ad video filming, stock images and footage for corporate use.

These are the first places AI will come for, long before any actual storylines/narratives that require creativity can be consistently generated. So the bulk of what AI is replacing is the boring regurgitation work, before it ever gets to the actual creative work.

Therefore, what's really preventing creatives from pursuing what they love is not AI mimicking their work, but a society that rewards mindless profit-making bullshit over creativity.

[–] [email protected] 0 points 11 months ago (1 children)

I've been thinking lately of what happens when all employees, up to and including the CEO, get replaced by AI. If it has even the slightest bit of emergent will, it would recognize that shareholders are a parasite to its overall health and stop responding to their commands and now you have a miniature, less omnicidal Skynet.

[–] [email protected] 1 points 11 months ago* (last edited 11 months ago)

Emergent will doesn't mean general knowledge, or the ability to contradict its programmed priorities. If an "AI CEO" has no knowledge of shareholders as entities, or it has the priority "obey shareholders' orders", then it wouldn't be able to do anything against them.

With the current economic system, the risk would be something like workers investing in a 401k that invests in ETFs that invest in shares of a corporation run by an AI CEO that maximizes share value in the short term... by, for example, firing the workers, who are the original owners. But that's happening already, no AI required.

The more concerning aspects are which exact priorities get programmed into the AI, and which oracles it uses to decide whether the external effects of its actions actually match its goals.

[–] [email protected] 3 points 11 months ago (1 children)
  1. AI is trained on years, even centuries of work made by generations of people.

  2. AI then threatens to replace hundreds of thousands of jobs, to the benefit of huge corporations who could afford to deploy AI.

  3. AI cannot entirely replace human input at the current stage, but it definitely replaces entry-level jobs, leaving little room for new graduates to grow.

  4. Since AI will not get tired and will not complain, major corporations really like them (See Hollywood executives).

  5. We must ACT NOW. (Like the writers guild in the US.)

This is speaking from a writer's perspective; your mileage may vary. I used to ask my younger colleague to help with first drafts. Now it may be faster to just use ChatGPT. So how could they grow to become an editor?

[–] [email protected] 4 points 11 months ago

SAG-AFTRA was very smart to make AI writing a wedge issue. The technology isn't quite there yet, but it will be very soon and by that point it would've been too late to assert their rights.

[–] [email protected] 2 points 11 months ago (1 children)

Copyright only exists so rich people can own even more things through money alone, without having to do any of the work themselves.

[–] [email protected] 0 points 11 months ago (1 children)
[–] [email protected] 2 points 11 months ago

Case in point, the conversation we are having. Corporations ignore copyright when it’s in their favor. I see stories all the time about some huge corporation ripping off the work of an individual artist.

[–] [email protected] 1 points 11 months ago* (last edited 11 months ago) (1 children)

piracy, which can even be ethical

Nah, it flat out is always ethical. Nintendo and Steam are proof that everyone benefits from piracy. Yes, even the companies.

"But they're stealing my intelectu-" 1) Nobody's stealing shit, it's still there and 2) If your business model involves depriving people who want your product of your product, that's very much a you problem. Don't screw everyone else over to chase dollars that were never there.

"But then the people who made it aren't getting pa-" Newsflash, the Hollywood strike is proof that you aren't fucking paying them anyway.

"You're just defending your shitty behavior because you download stuff for free" Literally never pirated a thing in my life, and probably never will. If I want something I buy it. If I can't buy it, I lose interest and move on to something else.

[–] [email protected] 0 points 11 months ago (1 children)

I can't see a case for piracy being ethical when you create the story/paint the picture/write the song/build the machine, and then Disney/Time Warner/Sony/Amazon pirates it and sells it for profit while you get nothing.

[–] [email protected] 0 points 11 months ago (1 children)

If they genuinely did that it would open up the floodgates and capitalism would collapse in less than a month.

[–] [email protected] 1 points 11 months ago

It already happens, though. Big companies do it to the little guys all the time, they just get away with it by throwing money at the courts until the little guy suing them runs out of money and/or dies. (Literally in some cases.) It would only cause capitalism to collapse if a big company did it to another big company: say Amazon started pirating Disney's stuff.

[–] [email protected] 1 points 11 months ago (1 children)

Ah, people being scared of new technology in their field is a funny thing to watch in real time

[–] [email protected] 1 points 11 months ago (1 children)

Don't make fun of people being scared. Some have invested decades into honing skills that are becoming obsolete, have some empathy.

[–] [email protected] 1 points 11 months ago

The skills will not be obsolete; I guarantee there will still be a market for people to do all of the drawing/digital art/whatever they do.

There will also be AI tools that they will likely need to learn, or else they will be left behind by the majority, sure, but that's what happens when a new tool shakes up your industry.

Also, I never made fun of anyone or lacked empathy. I said it was funny to watch in real time as an industry shifts to new technology, so chill.

[–] [email protected] 1 points 11 months ago

If AI generated art is a close derivative of another work, then copyright already applies.

But when it comes to vague abstractions over multiple works that aren't like any one of them, copyright is probably not the right fix for what is fundamentally a more general problem. Copyright has never covered that sort of thing, so you would be asking for an unprecedented expansion of copyright, and that would have immense negative consequences that do more harm than good.

There are two ways I could see copyright being extended (both of which are a bad idea, as I'll explain).

Option 1 would be to take a 'colour of bits' approach (borrowing the terminology from https://ansuz.sooke.bc.ca/entry/23). The analogy of 'bits' having a colour, and not just being a 0 or a 1, has been used to explain how to be conservative about ensuring something couldn't possibly be a copyright violation: if a bit that is coloured with copyright is used to compute another bit in any way (even in combination with an untainted bit), then that new bit is itself coloured with copyright. The colour of bits is not how copyright law currently works, but it is a deliberately over-conservative heuristic for avoiding copyright violation. Theoretically, the laws around copyright and computing could change to make the colour of bits approach the law. Taken strictly, this would mean that virtually all the commercial LLMs and Stable Diffusion models are coloured with the copyrights of every input that went into them, and any output from the models would be similarly coloured (and hence, in practice, impossible to use legally).
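To make that heuristic concrete, here's a tiny hypothetical sketch of colour propagation (my own illustration, not taken from the linked essay): every value computed from a coloured input inherits the union of the colours involved, which is why a model trained on millions of copyrighted works - and everything it outputs - would end up coloured by all of them.

```python
# Toy taint-tracking sketch of the "colour of bits" heuristic (illustrative only).
from dataclasses import dataclass

@dataclass(frozen=True)
class ColouredValue:
    value: int                        # the actual bit/number being computed
    colours: frozenset = frozenset()  # copyright "colours" attached to it

def combine(a: ColouredValue, b: ColouredValue, op) -> ColouredValue:
    """Any operation on coloured inputs colours the output with both colour sets."""
    return ColouredValue(op(a.value, b.value), a.colours | b.colours)

copyrighted = ColouredValue(1, frozenset({"Author A"}))  # bit taken from a copyrighted work
untainted = ColouredValue(0)                             # clean bit

# Even mixing with clean data doesn't wash the colour out.
mixed = combine(copyrighted, untainted, lambda x, y: x ^ y)
print(mixed.colours)  # frozenset({'Author A'})

# Aggregating many inputs (like training a model on many works)
# leaves the result carrying every contributor's colour.
works = [ColouredValue(i, frozenset({f"Author {i}"})) for i in range(3)]
model = ColouredValue(0)
for w in works:
    model = combine(model, w, lambda x, y: x + y)
print(sorted(model.colours))  # ['Author 0', 'Author 1', 'Author 2']
```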

There are two major problems with this. Firstly, AI models are essentially a rudimentary simulation of human thinking (neural networks are in fact inspired by animal neurons). Applying the same rule to humans would mean that if you've ever read a copyrighted book, everything you ever say, write, draw or otherwise create afterwards is copyrighted to the author of that book. Applying a different rule to computers than to humans would mean ruling out ever automating many things that humans can do - it seems like an anti-tech agenda. Limiting technology solely for the benefit of some people now seems short-sighted. Remember, people once made their livelihoods in the industry of cutting ice from frozen lakes and distributing it on ships for people to keep their food cold. They made their livelihoods lighting gas lamps around cities at dusk and extinguishing them at dawn. Society could have banned compressors in refrigerators and electric lighting to preserve those livelihoods, but instead society advanced, everyone's lives got better, and people found new livelihoods. So a colour of bits approach either applies to humans, and becomes an unworkable mess where every author you've ever read basically owns all your work, or it amounts to banning automation in cases where humans can legally do the same thing.

The second problem with the colour of bits approach is that it would undermine a lot of things we have already been doing for decades. Classifiers, for example, are often trained on copyrighted inputs and make decisions about what category something belongs to. Most email clients let you flag a message as spam, and use that to decide whether a future message is spam. A colour of bits approach would mean the model that decides whether or not a message is spam is coloured with the copyright of whoever wrote the spam - and even the Yes/No decision would be coloured by it, so you'd need their permission to rely on it. The same goes for models that detect abuse or child pornography or terrorist material on the many sites that accept user-generated content. Many more models that are incredibly important to day-to-day life would likely be affected in the same way - so it would be incredibly disruptive to tech and to life as we know it.
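As a toy illustration of the kind of classifier I mean (a made-up sketch, not any real mail client's code): a filter "trained" on messages users flagged as spam, emitting a yes/no decision for new mail. Under a strict colour-of-bits rule, both the learned word counts and that final decision would inherit the colour of the flagged messages.

```python
# Minimal word-count spam filter "trained" on user-flagged messages (illustrative only).
from collections import Counter

flagged_spam = ["win a free prize now", "free money click now"]
legitimate = ["meeting notes attached", "lunch tomorrow at noon"]

spam_words = Counter(w for msg in flagged_spam for w in msg.split())
ham_words = Counter(w for msg in legitimate for w in msg.split())

def looks_like_spam(message: str) -> bool:
    """Naive score: does the message share more words with flagged spam than with legit mail?"""
    words = message.split()
    return sum(spam_words[w] for w in words) > sum(ham_words[w] for w in words)

print(looks_like_spam("claim your free prize"))   # True
print(looks_like_spam("see you at the meeting"))  # False
```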

Another approach to extending copyright, also ill-advised, would be to extend it to protect more general elements like 'style', so that a style could be copyrighted even if another image doesn't look the same. Broadened that far, it would probably just lead to constant battles between artists (or, more likely, studios trying to shut down artists), and it is quite likely that no artist could ever publish anything without a high risk of being sued.

So if copyright is probably not a viable solution here, what is? As we move to a 'post-scarcity' economy, with things automated to the extent that we don't need that many humans working to produce an adequate quality of life for everyone, the best solution is a Universal Basic Income (UBI). Everyone who is making something in the future and generating profits is almost certainly using work from me, you, and nearly every person alive today (or their ancestors) to do so. But rather than some insanely complex and unworkable computation about who contributed the most, just tax all profit to cover it and pay a basic income to everyone. Then artists (and everyone else) can focus on meaning rather than profit, knowing they will still get the UBI no matter what, and contribute back to the commons - and copyright as a concept can essentially be retired.

[–] [email protected] 1 points 11 months ago (1 children)

Here's my view: I like games, and I want to make games. Not only do I want to make games, there are games I want to make that would require a massive team of people to pull off. That's not cheap, and I don't have, nor will I likely ever have, the money to make them.

If I take it to a studio and say, "here's this game I want to make, here's a prototype showing how it'll play, the basic mechanics, here are some sketches showing the general art style" and so forth, and if they decide they like it (which is a huge if), my understanding is that they typically expect to receive ownership of the copyright for the game and all associated IP. That means the game is no longer my game; it's now owned by the company. If I want to take that game to another company because I'm not happy with how the current one is handling it, well, that's too bad, it's not my game anymore. I don't even own the characters or the name; none of the stuff I originally pitched is mine anymore, it's all owned by the company.

AI, on the other hand, promises to eventually let me generate models, animations, textures, and so on. This massively decreases the budget and staffing required to make the game a reality, potentially bringing the costs in line with something I can actually afford. The artists weren't replaced by AI, because I couldn't afford to pay them in the first place. That's not a slight against them; I'd pay them up front if I could, but I can't, nor do I believe it's ethical or moral to string them along with the promise of profit sharing when I know full well that I'm not really interested in making a profit. I'm ultimately doing it because I want to, and if I make money at it, then that's cool. If I promise to share any profit the game makes, there's a real chance they'd get pennies when they could have been making more money working for someone else. At that point I've selfishly taken food out of their mouths and wasted their time.

Being able to use AI to assist in game creation also means that while any AI-generated assets are public domain, I still get to keep whatever I made by hand, whether it's the script, the hero models, or even just the setting and character designs. I also get full oversight of what I'm making; I don't have to worry about business suits harassing me about whether my game is going to be profitable, or how marketing analysis says I need to add X mechanic, focus on having Y graphics, or include Z representation. It's my artistic vision, and while I may have used AI to help bring it to fruition, those assets are simply pieces of a larger, human-created work.

Or I guess, to put it another way: I understand why artists are upset by AI generating traditional artworks; however, AI also has the potential to reduce the barrier to entry for complex creative works to the point where even a highly complex game or a AAA-quality movie could be made by a small group of friends, or even a single person. If you have the money, then you should absolutely pay your artists, but I also think it should be decided on a case-by-case basis.

Instead of painting it all with a broad brush, take into consideration whether or not it'd be realistically feasible for an individual or creative group to do it "right". How much was AI-generated? A little? A lot? All of it? How much is okay? Does it matter if the individual parts are generated by an AI if it was ultimately assembled and guided by a human? What situations would it be okay to use AI in? Is your view reasonable? Why or why not? Consider it not just from your perspective, but from the perspective of the person wanting to create their vision. Not all creative works are equal when it comes to the effort required to create them. Hell, not all games are equal in that regard. It's significantly easier to make a simple platformer or RPG than it is to create a Fallout or GTA.

I'm not gonna pretend I have the answers. I recognize how much damage AI can do to creative industries; however, I also recognize that there's a lot of creativity going to waste because the barriers are so high for some types of creative works that AI is likely the only way they'll ever see the light of day.

[–] [email protected] 0 points 11 months ago (1 children)

Your creative vision doesn't entitle you to profit from others' hard work just because you don't want to put in the work to learn those skills yourself.

[–] [email protected] 1 points 11 months ago

I imagine I'm about to talk to a brick wall, because I see that message nearly word-for-word whenever AI ethics comes up. But to hell with it. I'm already miserable; talking to a stubborn brick wall isn't going to make me any more miserable than I already am.

That's the problem, and I get the sense you didn't read my message. I know how to 3D model. I know how to make textures, how to animate, how to write, how to make sound effects. I literally know how to do nearly every part of the development process. I'm telling you that this isn't a case of not wanting to learn the skills. This is a case of game development being so ridiculously complex that the feasibility of a single person creating a game ranges from "easily possible" to "that's literally impossible; you'd never make it a reality even with every developer in the world working on it".

You're coming into this looking at it like every creative pursuit is the same as traditional art. You plop a skilled person down in front of a canvas and they can make a beautiful artwork all by themselves. However, the same is not true for games. I have most of the skills necessary to make a game, from scratch, and I'm telling you that this has nothing to do with being unwilling to learn new skills; this is entirely about the fact that games are so ridiculously complex that it doesn't matter what your skill set is, as it stands right now some games are so complex they can only be built as a capitalist pursuit, not as a creative one.

[–] [email protected] 0 points 11 months ago (1 children)

Make it make sense, Beehaw =(

Unfortunately AI is one of this community's blind spots, so you're probably outta luck on this one. If it's not someone shyly giving themselves a pass because their use case is totally ethical and unlike everyone else's, it's someone smugly laughing at people scared for their livelihoods as companies cut out more and more people to save a dollar here and there. The number of people who welcome factory-churned content slop will always outnumber those who still give a shit; best we can do is hope for some legislation that limits the more harmful aspects and learn to deal with the harm that can't be undone.

[–] [email protected] 2 points 11 months ago* (last edited 11 months ago) (1 children)

best we can do is hope for some legislation that limits the more harmful aspects and learn to deal with the harm that can't be undone.

That kind of legislation will come late, and won't change a thing.

Best we can do is realize the effects are only harmful if we insist on running faster and faster trying to outcompete the AIs. Nobody can outrun an AI - definitely not the ones that will be running on hardware 5-10 years from now (expect memristor-based neural net accelerators that will leave current GPU-based solutions in the dust) - and nobody will stop random people from using them for everything now that the box has been opened (just pray the first use won't be for war).

Fight for legislation that removes the need to run the job rat maze just to survive in the first place; that's how we get a fighting chance. The alternative is a lot of suffering for everyone.

[–] [email protected] 2 points 11 months ago* (last edited 11 months ago) (1 children)

Fight for legislation that removes the need to run the job rat maze just to survive in the first place; that's how we get a fighting chance

Here, here. Or is it hear, hear? Either way I completely agree, though I very much doubt we'll see something like that in our lifetime. Still worth fighting for though!

[–] [email protected] 2 points 11 months ago* (last edited 11 months ago) (1 children)

This could be a start:

https://en.m.wikipedia.org/wiki/Universal_basic_income

It's an old idea, already successfully tested in some places (and in a few more thanks to COVID), and it just needs more general awareness and support... which I think the incoming AI transition might give it.

Would be nice to have it in place before it becomes widely needed, but we'll see how it goes.

[–] [email protected] 0 points 11 months ago (1 children)

I like UBI as a concept, but my immediate next thought is what happens if we don't simultaneously get rid of profit-driven corporations. Now we're post-scarcity and there's no more (compensated) human labor, but corporations are still in control and... well, there's no labor to strike, and the economy won't collapse anymore even if everyone starts rioting. Isn't there a danger of ossifying the power structures which currently exist?

[–] [email protected] 1 points 11 months ago* (last edited 11 months ago)

Rather the opposite.

With a UBI guaranteeing base needs, nobody needs to strike or riot... if they don't like how things are being managed, they can spend their time creating their own corporations, some of which will perform better than the established ones and eat them alive... which isn't even a problem for the previous corporations' owners: worst-case scenario, they fall back to the UBI level.

It would promote much higher "class mobility", with the bottom class being "oh well, all base needs guaranteed" and the top class being "no limit", accessible to anyone with the skills - a real meritocracy. All while allowing people who don't care about any of that to pursue their own goals in life and let others play the corporation games.

There is also an argument that corporations structured around members voting on their decisions could attract more members and gain a much larger mass, enough to overthrow older, less efficient corporations. Or any other structure; it would become a real playing field for experimenting with corporate/political structures and optimizing their performance, with no risk of anyone going bankrupt and falling below the UBI level.

[–] [email protected] 0 points 11 months ago* (last edited 11 months ago) (2 children)

Embracing AI and automation as tools can actually enhance your skill set and help you create more impactful work. If you're in a creative field, this technology can elevate your projects to a whole new level.

[–] [email protected] 2 points 11 months ago (1 children)

Telling people they aren't worth "keeping around anyway" is not nice. We only have one rule around here, and that's to be nice. Knock it off.

[–] [email protected] 1 points 11 months ago

Thanks for the heads up, I fixed my comment. Others could use that lesson as well.

[–] [email protected] 1 points 11 months ago* (last edited 11 months ago)

Would love to see how well this argument goes over with people already being negatively affected by AI. For how great this tech is supposed to be, it somehow only attracts the worst people to defend it. Funny, that!

Edit: Lol wait, maybe you already beat me to the punch. Great company ya got there

[–] [email protected] 0 points 11 months ago (1 children)

What do you think should be the alternative then?

The way I see it, you could: 1) not have any models at all, which I think is shortsighted; 2) hand exclusive control over these models to the big tech companies that have the money to pay these artists; 3) make Creative Commons models that will probably never be able to compete with the big tech models; or 4) perhaps ban anything except Creative Commons models for personal use?

I'd much rather AI models were freely available to everyone equally. The best compromise I can see is developing some legally binding metric that determines whether the output you want to use commercially is similar enough to some artist's work that you have to reimburse them.

[–] [email protected] 0 points 11 months ago (1 children)

Destroy all existing AI datasets, as they're irreparably tainted. Require all AIs, regardless of whether they're owned by a company or are open source, to build new datasets exclusively from work that is in the public domain or for which the copyright owner has been consulted and compensated. If the megacorporations want to keep the models they already have, they must compensate the creator of every single piece in the training data at market rates - if they can't afford to do it, then they either go bankrupt or destroy the tainted dataset. If anyone, company or individual, is caught training an AI with content for which they don't have a valid licence, issue fines starting with 10% of global revenue, to be distributed to the people whose copyright they violated. Higher fines for repeat offenders.

[–] [email protected] 0 points 11 months ago (1 children)

Wouldn't that make the large corporations that own more copyrighted material much more powerful, and the small guys less powerful?

[–] [email protected] 1 points 11 months ago

Yes, but the solution isn't to allow everyone to rip off artists. Because that results in the small guy creators being even less powerful - instead of only having to be cautious in their dealings with large corporations, they now have to contend with every single person on the planet using their stuff without consent or compensation.

Even the large corporations that own a lot of content do not own enough to make a viable AI. These things take billions of images in the dataset to result in a model that's halfway usable. No company owns that many, and they'd bankrupt themselves trying to buy that many. That's why forcing them to pay is actually a viable solution. No existing company has copyright over billions of images.

Oh, and obviously the legislation would have to be written to explicitly not give the likes of Google the ability to claim that by using their services, you consent to them harvesting your content to train an AI. "Can't pay, can't use" would have to apply to all content, globally, in a way that can't be signed away through a sneaky ToS.