Generative AI

You're not incoherently cussing about it, which is where I'm often at, so you sound pretty restrained to me.
Honestly, I'm at the point where if I could just walk away from society and leave it to whatever it's about to become I would. I'm fucking tired.

And what sucks is, if LLMs came for doctors or teachers or sales guys or marketing teams or plumbers or coders or farmers, the creative community would be the first people to step up and say "the fuck you are" and stand up for the other guys. In fact, it's authors and artists who are doing the most work to protect white collar workers and other professions from getting fucked by it, but the percentage of people who care is so depressingly small.
 
The fleeting satisfaction that at least you'll get to say "I told you so". Not terribly satisfying. I've been talking about the information warfare/deepfake angle on AI for years now and I think that one hasn't quite clicked for most people either. But it will. Eventually it absolutely will because that's the one thing Gen AI is unambiguously good at.
 
It's really difficult when so much of society is completely content to accept fabricated reality. In a generation, people will be begging to be plugged into the Matrix. An entire society of this guy, wanting to be told what steak tastes like rather than tasting it for themselves.

[Image: Joe Pantoliano (Cypher) reflecting on making The Matrix]
 
Remember when the promise of automation was that it would take over all the warehouse jobs and field jobs and bullshit we hate, so that humans could live our entire lives expressing ourselves artistically and being in love with the world? Instead, we gave all the creative jobs to the robots and designed a system where all humans get to do is toil until they drop dead, and the few who manage to put out any art are looked down on as nothing more than food for the algorithms.

It's impossible to overstate how badly we have fucked this up and how much it hurts to see how few people actually care. Or, in fact, how many people think this is all the way it should be.
 
Honestly, this is the confusion that's killing my entire industry. Again, not an attack - it's something MOST PEOPLE DO NOT UNDERSTAND. These AI algorithms have been fed our copyrighted work without our permission or compensation, so that massive tech companies can get rich off of it, and without guardrails. Yeah, your colleague did something benign and harmless, and in fact, if it hadn't relied on a technology that was already abusing her work to begin with, most authors would agree to it! But it's CONSENT, man. It's consent, to feed your HUMAN VOICE into a MASSIVE MONEY MAKING MACHINE that will then ALSO USE IT FOR ANYTHING ELSE ANYONE ELSE WANTS IT TO DO.
But what if, instead of AI writing that part, I went home and wrote that additional part myself, and gave it to my student for her to recite at the play? Is the author being screwed over somehow if I or my coworker did it? Yes, AI consumes these authors' works, but so do we. It's out there for anyone to consume, human or machine. I'm not trying to write a "gotcha" question here, I want to know what you think about this and understand it better. I don't know anything about legalities and nothing really about AI besides that it exists. Also, I don't know that it matters or not, but this play is being done strictly by students, for students. No money will be made, no admission charged. It's part of their curriculum as a public speaking exercise.
 
I think one point is it usually starts as 'harmless' but just leads to more and more reliance.

I dunno, one thing for me is that friends and some of my wife's friends know I can do certain things, but now they probably aren't coming to me to write anything or Photoshop stuff for them, because why bother Sean when I can just have the AI knock it out? After I watch a couple hours of dumbasses on TikTok, though.
 
I think you're getting lost here between the relative harm of a random teacher deciding to do this for a school play, which I think most authors wouldn't find especially harmful, and the mechanism by which this particular alteration is being carried out.

That mechanism is the primary issue. And it is part of a larger system that is being actively, intentionally, aggressively used to marginalize and devalue all human labor, not just artists. Teachers themselves are also on the block. The relative harm to an author of an individual doing this for personal use is indeed, almost nothing. If they do it by their own hand, even less. But using this system integrates it further into culture, workforces, and regular practice, and that's the thing that will endanger everyone. It's like climate change that way. Starting your car on any given morning contributes but a tiny amount, but the fact that we've built an entire nation around cars such that few other options are feasible has locked us into an environmental death spiral.

Once you integrate that process it starts to become a linchpin. Other systems become built on top of it. It becomes necessary. That's the problem with it.

And look, the car culture comparison here is intentional. America was not designed with cars in mind. That was a thing that people made happen over the course of decades of legislation (look at the history of jaywalking laws for example). That series of decisions was largely made before we were born and would require just as many decades to roll back. We could just as easily have made a nation built around public transit instead if we'd wanted. This is the era of the Model T for AI. The artists are telling you what this will look like 20 years from now. That's why these small things do still matter. It's the electorate who chooses how society does things. But the electorate isn't good at realizing it's a frog in a pot.
 
We wanted Star Trek and somehow we're instead just returning to the coal mines. And I think at the end of the day the reason is grossly simple: Star Trek is a world where nobody has absolute power, and the owner class can't have that. We can't even move appreciably toward it.
 
Honestly, that part about going home and doing it on your own? Truly harmless. Only the most litigious, asshole authors on earth would get mad at you for that. Hell, if someone converted my books into a high school stage play I'd tell people about it myself on social media. The problem is when it goes broader than that - with all the major AI options, anything you feed in becomes part of their "soup" that other people can generate stuff from. It's taking protected work and putting it into a massive machine owned by the worst people on earth, or using that machine, owned by the people who ALREADY stole our work, and giving the output to your colleague (who, because the AI companies don't tell you this, innocently has no idea she's using stolen goods).

Think of it like this: you have your class draw fan art of The Hunger Games. No big deal. Hell, Suzanne Collins would love it. Versus: you ask AI to rewrite The Hunger Games to, I dunno, edit out a character you don't like. Or create a second sister for Katniss. And then that becomes part of that algorithmic soup that neither you nor Collins owns. By using AI you've invited that bigger, for-profit player into the equation.

Last year the Writers Guild polled its members: if you could make X dollars by licensing your work, with your consent, to a company like OpenAI, would you do it? And 95% of respondents said no. Authors don't want their work in that machine, and all they're asking is to have that desire respected.

(Mind you, we also have to be assholes about it - if you don't defend your IP when it's misused, eventually the courts take it as a sign that you're unwilling or uninterested in defending it. It's why you'll see up-and-coming artists or comics issue rules for legally creating fan art once they hit a certain point. You legally need to safeguard your work or it's considered negligence. But OpenAI, Meta, Google, and Anthropic all went out and grabbed everything they could and figured it was better to ask forgiveness than permission. And that's why they're losing lawsuits.)

It's funny, just today news broke that Meta was torrenting pirated books and their own staff knew it was unethical. If I sound grumpy and short-tempered, it's cos we get bad news about this every day. And truly, we were all hoping generative AI would be like NFTs - sure, I hate NFTs, but they weren't HURTING anyone. But AI morphed into something so much worse for creators.
Yeah, that's a good one-sentence summary of it: the relative harm versus the mechanism. To do it once in a classroom, for education and fun, using your human brain? Like I said, I'd brag about it. (I actually had a class recreate one of my fight scenes using stage combat a few months back and I shared the video with my readers, it was fuckin' RAD.) But if they'd used AI to create a fight scene of my characters in video format I'd have been devastated and, honestly, probably would've legally needed to send a C&D if they didn't take it down.
 