Generative AI

As for tracing: tell me you don't know a lot about how illustration and painting are done without telling me.

*sigh* Look. I know everyone is tired and constantly pissed off on the internet nowadays and this is a heated topic. But you don't have to become condescending. Now, I'm not going to assume malice in your comment, but stuff like that is the reason no one actually talks to each other anymore. :(

I have done both traditional and digital as a working professional. I know whereof I speak on this.

So have I, but okay. I'll just keep out of it :/
 
But you don't have to become condescending.
Here was the part where I felt condescended to:
That's... romantic
Followed by some stuff that simply has no basis in how professional illustration is done or the history of it.

I apologize for my tone, but please assume I have some vague idea, after working in art fields for nearly 20 years, and now working on a committee regarding AI use in my current workplace, that I might have some clue what I'm talking about.

You basically said "that's cute". And I'm sorry, but no. No it's not.
 
I don't want to push more vitriol on this board since it's otherwise a very sane and safe place, but the "but Photoshop" argument might be my absolute least favorite way to defend AI, as someone who was working on the (lack of) data privacy side of LLMs, and someone who is included in two lawsuits against two major LLMs because they pirated my life's work and fed it to their algorithm. I'm just going to calmly say I've been deeply, against my will, involved in watchdogging generative AI, and while we as artists ABSOLUTELY complained about people photoshopping stuff, it's not even apples to oranges; it's comparing driving an automatic instead of a stick shift to stealing a space shuttle.
 
Phew.

My college holds a mandatory training session every quarter. This quarter's training was about AI. I don't think I've ever experienced so much emotional whiplash in a workday.

Much of what I heard was crazy-making. Our keynote: "AI mines content from countless sources, using the material to teach itself. It doesn't always pay for the content, but that's another story."

Is it? That's kind of fundamental to the story.

"No one on this campus is being replaced."

At best, this is willfully ignorant. No one is being replaced right now. The tech companies are selling AI as the end of labor. Either you disbelieve what they're selling or think you can stop using the technology before it makes our jobs redundant. There is no "I'll only use it a little bit" option. By using it, you're improving it. I'm dubious about whether it can do half the things they claim it will, but I'm not going to help them improve the tool that could send me to the unemployment line.

The second part of the day was an icebreaker. I couldn't tell you what we were supposed to be talking about because my table got into a heated argument about the ethics of AI instead. My best quality is my ability to stay on an even keel. Arguing with coworkers—many of whom I work with daily—about something I'm passionate about nearly had me flying off the handle.

This is one of those issues that I can't see the other side of. They ruined search engines. They stole creative labor. They set up the economy for another massive recession. All of this in the name of never paying for creative labor again. It's an affront to humanity.

From there, we went into breakout sessions. My first breakout session was about the ethics of AI. The session was packed. The speaker did a great job covering the environmental impact, the power imbalance, and the dangers of a single source controlling the world's information. It quickly rebuilt my faith in humanity. In the Q&A session at the end, the questions were thoughtful and overwhelmingly critical.

Ah, surely we're all just forced to be here, I thought.

Well.

The last session covered how to use AI effectively. Everyone in the room was asked how they use AI on a daily basis. I was one of maybe 15% of the people in the room who did not use AI at all. Apparently, most folks use it to clean up their emails. That puzzled me as much as anything.

1) Who cares how your emails sound?
2) What are we doing? Why are we stripping our emails of all personality? What is the point of being human if we're all sending the same sanitized email back and forth?
3) Is it worth using the Environment Destroyer 3000 to sound 10% more polite?

Being an academic institution, we also covered cheating. "Students cheated 50 years ago; students cheat now." What we didn't cover is how much easier it is to cheat now. Nor did we cover what this is doing to our critical thinking skills.

I could go on and on. It was emotionally exhausting.
 
I'm pretty sure at least part of this is AI generated, given how the logo on Spidey's back looks more like a tick than a spider.

 
This is not a new medium that executes the same thing faster or in a different format. This is an active removal of human decision making. There is no real parallel between the two except that one is a newer technology. Digital art did not, regardless of what some companies commissioning art thought, do the work for you at the click of a button. This effectively does. And it does so only because it has literally consumed the work of humans who did the work the hard way first. It cannot function without eating and devaluing the work of those who came before.
I have coworkers whose attitude is, "It's a time saver, embrace it".
I'm pretty sure my principal uses it for all of his communications. I'll go to him on rare occasions and ask something like, "How should I respond to this angry parent email?", and he'll say, "Let me think it over and I'll send you an idea for a response". LOL, that idea is him typing it into AI. "You can look it over and edit it to make it sound like your voice" is some of the language I hear to ease doubts.....I'm just not on board. I can type something to parents just fine on my own, I think.

Another coworker, who is amazing and who I have a lot of respect for, is doing a play with my class based on a children's book. Well, we got a new student halfway in and we were out of parts for her to play. My coworker asked AI to generate some dialogue based on this book and yes, sure enough, that student has now been inserted seamlessly into our play. Her made-up part sounds just as authentic as the parts the other kids got from the book. That's handy......I guess.

It all just feels like a slippery slope to me. I'm in a rush one day, just don't have time to write this email, have AI do it quick "just this once", and before you know it, how many times am I turning to this thing that's "just a time saver" to get through the day? No thank you.
 
Apparently, most folks use it to clean up their emails.
The vast majority of real world uses for gen AI I've seen from coworkers (the few who use it) are basically this. A time-saver for tasks that, while not especially fun, are also not that hard. The other uses I see are for creative production in one form or another, and those are simply baffling to me because that's supposed to be the "fun" stuff. Or, at least, the stuff where half the point is getting to say I did this.
Being an academic institution, we also covered cheating. "Students cheated 50 years ago; students cheat now." What we didn't cover is how much easier it is to cheat now. Nor did we cover what this is doing to our critical thinking skills.
While not quite the same, the other use of AI I see, and this one is now all the time, is in hiring. It's obvious on my end because a majority of our applicants are international graduate students. Their applications and cover letters over the past couple of years have two qualities: one, they are all, suddenly, very grammatically clean for folks who typically have a harder time with written English, and two, there's now basically no actual information in their cover letters.

This is a real problem for them because it basically means hiring has gone from a process where I read applications carefully and look for bits of personal experience that might apply to our space (most folks have never worked in or around a makerspace before, so nobody has prior experience that way) to glancing at their previous positions and throwing a dart at a board. This is worse for them because the majority of international students at our university right now are in one of three majors and mostly come from not just the same country, but the same two cities in India. Their resumes are near identical already.

They don't realize it, but AI is making them much less hirable. And they WANT to be hired by me, because my space offers a tuition waiver that makes their tuition fees in-state. But of course, they think I care more about grammar than the actual information. It's a deep misalignment of what is qualitative in our space and it has made hiring pretty miserable. Of course, they assume we're just using AI to sort them all anyway, so why bother trying harder?
 
Another coworker, who is amazing and who I have a lot of respect for, is doing a play with my class based on a children's book. Well, we got a new student halfway in and we were out of parts for her to play. My coworker asked AI to generate some dialogue based on this book and yes, sure enough, that student has now been inserted seamlessly into our play. Her made-up part sounds just as authentic as the parts the other kids got from the book. That's handy......I guess.
Good god, the depths to which this violates the children's book author's work and intellectual property just made me see red. Not directed at you, it's the lack of understanding on the part of the coworker. They fed someone else's life's work into a machine to rewrite it. It's so fucking violating and people have no idea. Imagine needing a stock photo of a kid and feeding someone's child's photos into AI and seeing what it spits out. It's like that.
 
Good god, the depths to which this violates the children's book author's work and intellectual property just made me see red. Not directed at you, it's the lack of understanding on the part of the coworker. They fed someone else's life's work into a machine to rewrite it. It's so fucking violating and people have no idea. Imagine needing a stock photo of a kid and feeding someone's child's photos into AI and seeing what it spits out. It's like that.
Yeah...... this.

I think there are good intentions here. To make a kid feel included. But I'd feel better about it if human hands and human minds had done the work to look at the piece of fiction and say, "How can we, human beings, understand this piece of fiction in order to creatively add something to it in a respectful way?" Adaptations are okay. Changes, especially for the sake of some kids' play, are okay. But it's the feeding it into an algorithm and letting a machine make the decisions that grosses me out and upsets me for sure.
 
Oh yeah, I see the good intentions. And pre-AI, the teacher, who is likely under-resourced and overworked, would have had to make time to do the rewrite him or herself. But it ends up being class warfare, with underpaid, under-resourced people violating each other, when really the villains are these hoarding tech motherfuckers not paying enough taxes to make sure teachers ARE fully resourced.

But in the end, it's an under-resourced teacher violating the sanctity of the work of a person who is very likely also underpaid (only 5% of authors make a full-time income from their work - yes, even your favorite author likely has a day job, I had lunch with Paul Tremblay a few years back and he can't afford to quit his teaching job despite having multiple books optioned into feature films and being one of Stephen King's favorite writers). It's people hurting each other who don't have enough and we've hit the point where NOT understanding how AI works is not really excusable anymore. The education is out there. The lawsuits are public. Not understanding you're hurting other people doesn't mean you're not hurting them.
 
Agreed. I just don't know where we go with any of this. We definitely seem to have hit a breaking point in society where hurting other people, especially in less visible ways, is seen as an acceptable sacrifice for whatever conveniences we want. Not exempting myself there. We ALL make active decisions to do things that hurt other people and it's usually for entirely selfish reasons. I LIKE chocolate, so I'll pretend I believe the company saying it's not slave chocolate even though we ALL fucking know there's basically no chocolate on earth that isn't, in some way, slave chocolate.
 
Oh yeah. It's just the way of the world now. Amazon treats its employees like subhuman chattel, but without them I lose 95% of my book sales. I'm using Gmail still despite them being absolute scum. I'm 100% sure the laptop I'm typing this on was built with child labor. My old company, which made key components of COVID vaccines, got bought by Peter Thiel - my buddy still works there and his wife has always said "do whatever you need to do to support us, just don't take a contract with someone who bombs babies." But his company got bought out by a guy whose software is used to target civilians with bombs. Literally there is no ethical consumption anywhere. But all we can really do is just... like fuckin' try to be aware of it, I guess.

It'd be nice if we'd have even the most remote upswing toward a society where literally everything we do doesn't fuck someone else over. I was just saying to a friend over the weekend I can't remember the last time something good for the world happened. We just keep inventing ways to cannibalize each other.
 
Good god, the depths to which this violates the children's book author's work and intellectual property just made me see red. Not directed at you, it's the lack of understanding on the part of the coworker. They fed someone else's life's work into a machine to rewrite it. It's so fucking violating and people have no idea. Imagine needing a stock photo of a kid and feeding someone's child's photos into AI and seeing what it spits out. It's like that.
I don't know if I see it that way. We were going to write some part for this student one way or another, whether she or I or a machine did it. We were going to put additional material into this play that wasn't in the book. Well, books and other things get adapted all the time for plays and movies; some liberties get taken, some things get added, some things get taken out. Was Tolkien violated when the Hobbit movie added all those additional elements? Does it make a difference if a human or a machine added those things? (No, I promise you, our play is nowhere near Tolkien-level :) ).

My biggest qualm with the whole thing is the human element being taken out of everything. Saying, "Oh, you didn't need me to write that email or draw that picture after all." "I COULD put my heart and soul into my work, or I could ask a machine to do it". It's going to START OFF as "a good time saver" but very soon it will become much much more.
 