Scriptnotes, Episode 669: They Ate Our Scripts, Transcript

The original post for this episode can be found here.

John August: Hello, and welcome. My name is John August.

Craig Mazin: My name is Craig Mazin.

John: You’re listening to episode 669 of Scriptnotes, a podcast about screenwriting and things that are interesting to screenwriters.

Today on the show, the revelation that many of the biggest AI models have been trained on film and TV dialogue has writers up in arms. How should we think about this moment and the coming AI fights? We’ll discuss the options. Plus, we’ll have listener questions and feedback on contracts and bailing on a project.

In our bonus segment for premium members, Craig, you frequently say that we are living in a simulation.

Craig: Yes.

John: Does that mean that you are a theist who believes in a creator? We’ll discuss the philosophical implications of this dynamic.

Craig: Fair question.

John: All right, fair. First, we have some follow-up. Drew, help us out. Let’s go back to episode 666 a few weeks ago where we talked about satanic movies.

Drew Marquardt: Steve writes, “I have a slightly more detailed answer to Emily’s question about the difference between thriller and horror. Thrillers scare us with the fear of death, usually in a gruesome manner like being cut with a knife or slashed by the claws of a beast. I would say that slasher is just a subgenre of thriller that is maximally bloody and usually involves a maniac with a blade, hence the name.

Horror films often involve the fear of death, but more importantly, the fear of losing your humanity or soul. Being turned into an undead vampire, werewolf, zombie, et cetera, is its own type of death.

As John pointed out, the first Alien movie was horror in space, because the thought of being turned into a host for an alien offspring and being alive while it’s growing inside you is a true horror, and then the darn thing is born and it’s game over, man. Just losing his humanity, like Kurtz in Heart of Darkness and Apocalypse Now, is enough for him to utter the famous line, ‘The horror, the horror.’”

Craig: I appreciate the thoroughness of this theory, and I like the way it’s circled back around to Heart of Darkness, but yeah I don’t know if I agree.

John: I think any time you’re trying to establish a clear taxonomy between genres, between categories of things, you’re going to run into some messy things. What I like about what Steve did here is he talked about there are a lot of movies that are clearly thrillers that are not horror films, and they involve peril in a way, and sometimes physical peril, but sometimes it’s getting your adrenaline up in those ways, versus horror films, which there’s sort of a seeping dread quality to horror that is different than what you find in a thriller necessarily.

Craig: Yes, I think it was just a little too narrow on thriller because thrillers adrenalize you in so many different ways. They don’t always involve the fear of being slashed or dying.

John: There’s a peril, something’s in threat, but it’s maybe not your own life.

Craig: Right. Did I mention that movie, Flightplan, last time?

John: Oh, yes, we did.

Craig: I don’t know why I keep coming back to Flightplan of all movies. Because the thing is, it’s a great idea for a movie. It wasn’t my favorite execution, to be fair, but I love the concept of it, and that’s a great thriller. Someone’s gaslighting you into believing that you didn’t have a kid, but your kid is lost. There’s no fear of death there. You’re not afraid for your own life. You’re more just– it’s a paranoia thriller.

Drew: It’s a remake of a Hitchcock movie.

Craig: Is it?

Drew: The Lady Vanishes.

Craig: You’re kidding. I never put that together.

John: Sure.

Craig: Oh, you know what, everything comes back to Hitchcock.

John: It does all come back to Hitchcock.

Craig: He’s very good at thrilling you.

John: Let’s talk about some generational narcissism. LaWant wrote in to us.

Craig: Wait, I need to know if that’s– sorry, is that narcissism? Oh, I remember.

John: Yes, I think you made up that term last time.

Craig: Okay, because when I heard it, when you just said it, I thought, well, somebody’s narcissism is so profound. It’s like a generational narcissism. Once every 20 years, someone is so narcissistic. Okay, let’s talk about generational.

John: Once every 20 years, there’s a generation born that is narcissistic.

Craig: Now we can talk about the generational narcissism.

Drew: Yes, this one had to do with everyone thinking their generation was the last or the end of the world.

Craig: Yes, of course.

Drew: The last episode, Craig was looking for a word or phrase to describe how every generation assumes they’re the last one. He came up with generational narcissism. Here’s my suggestion for another one. Temporal solipsism. We can see the past, but we can’t see the future, so part of us assumes it doesn’t exist.

Craig: There’s a running theme here. People are just complicating stuff that we’ve said.

John: Absolutely.

Craig: We’re actually pretty good at this. We did a very good definition last time, I think, of thriller and horror. I think generational narcissism is a little more accessible than temporal solipsism. Solipsism means nobody else exists.

John: Yes, that’s the problem. I think the challenge with solipsism is like me as an individual is the only thing that has meaning or could ever be known. Really we’re talking about a cultural sense that we are all together at the end times. That we are the last generation.

Craig: Thinking that you’re somehow special or important is not solipsistic. It’s narcissistic. I stand by my words.

John: All right.

Craig: That said, we encourage feedback.

John: Yeah the subtle distinction between solipsism and narcissism is something we’ll get into in Episode 1053 of Scriptnotes.

Craig: You say that, and then what’s going to happen is we’re going to get there.

John: Absolutely. Someone’s taking a note right now. “You said you would do this in 1053.”

Craig: “You guys.”

John: “You guys.”

Craig: “You guys.”

John: Unlike Craig, I do recognize that people do listen to the show.

Craig: I had no idea.

John: We got an email from a mutual friend who was talking about running into another big-name writer who referenced a very specific thing mentioned on one specific episode of Scriptnotes.

Craig: Can we just say who it is?

John: Yes, we can say who these guys are.

Craig: You could say who both of them are. It was Taffy Brodesser-Akner, and she ran into the living legend, Tony Gilroy. Now, I’m still suspicious. I don’t think Tony Gilroy listens to– I don’t know.

John: He listens to at least the Moneyball Episode because he referenced a thing that was specifically mentioned in the Moneyball Episode.

Craig: Somebody probably said, “Hey, go listen to the Moneyball Episode.” I can’t imagine that Tony Gilroy was like, “Hold on, let me–“

John: How do they have time?

Craig: “I got to put Andor on pause for a second, listen to a debate over what makes a thriller and what makes a horror movie.”

John: Now, there’s an equally valid way of saying Tony Gilroy was eating at a restaurant and ran into the legendary Taffy Brodesser-Akner.

Craig: Yes, although by her telling, it seemed more the other way.

John: That’s because it was her telling him. Therefore, she’s always going to place herself in the inferior role to someone she admires.

Craig: Opposite of generational narcissism. Generational core shame. I’ve never actually met Tony in person. I’ve been on some email chains and things with him, but I do know his brother, Dan. I’ve spent a little bit of time with his brother, Dan, who’s a lovely guy and also brilliant. Some pretty good storytelling genetics over there in the Gilroy clan.

John: I guess so. They didn’t grow tall, but they grew smart.

Craig: They’re not short, as far as I can tell. I don’t recall them being short.

John: No, but I would say they don’t have the grow-tall genetics. It’s not like, “Oh, they’re a family of basketball players.”

Craig: No. No, they are not. This is rarer, to be honest. Tony Gilroy, that guy’s good.

John: He’s good.

Craig: Oof.

John: A thing you all learn now on the Scriptnotes podcast is that Tony Gilroy, the Emmy Award-nominated and Oscar-nominated, probably–

Craig: Wildly celebrated. Do you think that he’s just finally figuring it out now, listening to us like, “I am good”?

John: “Wait, I am good. This inferiority complex I’ve been carrying around this entire time, this imposter syndrome that I’ve been living with, maybe because John and Craig are saying, ‘Tony Gilroy, you’re good.'” This is a podcast about how good Tony Gilroy is.

Craig: It is now.

John: It is now. Let’s do some more follow-up on how Hollywood got old. This was Episode 664. We were talking about how there used to be these young studio heads and you just don’t see young people running Hollywood anymore.

Drew: Yes, and so the one episode I was gone, Craig, you talked about the lack of ambition amongst young people in Hollywood today.

Craig: You timed it perfectly.

Drew: Scriptnotes the producer.

Craig: Yes, because you just weren’t ambitious enough to show up that day. [chuckles]

Drew: Clearly. Well, a few of our listeners had my back.

Craig: Okay, here we go.

Drew: Alyssa wrote in. She said, “I just turned 37 and while I would describe myself as incredibly ambitious my whole life, my hardcore f the rules career ambition took off only a couple of years ago. The reason this has come so late is simple. Student loans. Unlike the generation of hustlers before us, we also had monthly loan payments of $1,200. To cover this, I worked two jobs, one full-time and one part-time at night. These loan payments almost completely exhausted the ambition out of me. I did manage to get into a production company by swinging one day a week as an unpaid intern, but they cut my position in favor of those who wanted it more because they could afford to put in more days.

Everything changed when I married a man with a steady teaching job and parents who could afford to send him to college. As soon as I was able to share finances, I could drop down to one job and just like that, my career took off. Suddenly, I’m proud of the ways I’m figuring out how to get my work out there despite a slow market. I’m not waiting. I’m grabbing the industry by the throat in all the ways I couldn’t 10 years ago.

I’m not giving you excuses. I’m simply pointing out one reason why my generation may look stunted to those older than us. The drive is there. The ambition is there. But many of us are slaves to a debt we didn’t realize we’d be paying for the rest of our lives when we took it on at 17.”

Craig: I love when people say, “I’m not giving you excuses.” Here, however, is a reason why — that’s called an excuse. There’s nothing wrong with excuses. Why did that become a bad word?

John: I know. Why did excuses become such a pejorative? An excuse is an explanation.

Craig: You’re excused. It’s like you’re pardoned of a crime. That’s what an excuse is. I’m sure this is what she was hoping the answer would say. Did you have student loans, John?

John: I did not have student loans, but I went to an inexpensive school.

Craig: I had student loans. I don’t know why. The premise of this seems to me that student loans just suddenly popped into existence or something. They’ve been around forever. I had student loans to pay off. They’ve always been there. The cost of education has gotten insane. Now, some schools, my alma mater, for instance, have eliminated all loans. Whatever you can’t afford, they just grant you. There are no more loans. In my case, I had to work and pay off loans. Sometimes when we talk about these things, there’s a temptation for somebody to go, “Whoa, I’m being judged.”

“If I’m not in charge of a studio, then you’re telling me that’s my fault because I’m not ambitious.” That’s not why. Here’s why. Almost no one can be in charge of a studio. I just want to be clear. This is not about you, this is about us in the aggregate.

John: I think we’re also talking about slightly different things. We’re talking about aspiring screenwriters versus aspiring like, “I’m going to run a studio.” One thing is that I think we were– I don’t remember exactly what we talked about, but the same young people who were running studios back in the day, I think are not working in this industry. I think they’re working in tech and they’re working in other places.

Craig: That may be true.

John: I think that’s the missing piece that I’m finding here.

Craig: A lot of variables, but I think part of the problem is a self-perpetuating cycle. When you look and see who’s running a studio, that’s who you presume should be running a studio. In this case, it’s a bunch of people who are our contemporaries. Donna Langley, for instance. People who are 23 are going to look at Donna Langley and go, “You’re supposed to be Donna Langley’s age when you do this, you’re not supposed to be mine.” There did seem to be a little bit more flexibility and attraction to wunderkinds.

Another thing that probably made a huge difference that has nothing to do with ambition is how Hollywood is owned. Because when we entered the business, a lot of these studios were still their own companies. They hadn’t become the massive international multi-conglomerates. In that case, risk aversion starts to set in. If you’re just Columbia, why not? Wing it, go for it. If you are part of the Sony Corporation, maybe not.

John: It’s also reminding me of the conversations we had around Pay Up Hollywood and all the issues of those entry-level jobs being so woefully underpaid in Hollywood, and the work that we did to try to make sure we were raising those to survival wages. The point is that instead of the two jobs Alyssa was taking, she should have been able to get one job in the industry that covered her rent and gave her the experience she wanted. Increasingly, for a period of time, and still today, it’s really challenging to do that. The people who can afford to take those jobs, that’s not the breadth of people we would love to see rise up in the industry and kick ass.

Craig: Yes, I completely agree. Life is complicated now. There are a lot of bills that you and I never had to pay. We never had an internet bill. We weirdly had phone bills. They were so much cheaper than cell phone bills.

John: We also had long distance though, which is a weird thing to pay for separately.

Craig: That’s why we never called anyone, or that we would make all our calls at work. “Press nine to get an outside line.” Oh yes, sneaking in– did you ever get in trouble for making long-distance calls at work? I did.

John: I did not. But I do remember a friend calling me who had figured out a scam long-distance calling card number. He was just calling me because he didn’t really necessarily want to talk to me. He just wanted the scammability.

Craig: Free minutes?

John: Yes, free minutes.

Craig: At three minutes, I got to go talk to somebody or I’m wasting my crime. I remember getting called into the office in my first workplace, just a small advertising company. They were like, “Your extension, you’ve called a number of these, and it’s added up to $40 or $50.” Which, as a percentage of my weekly salary, was significant. It was a real problem.

John: Now Drew, does any of this resonate with you? Because you grew up in a time post long distance, but you were living overseas, so there probably were still costs for calling home.

Drew: I’m trying to think. No, I had Skype by the time I was overseas.

Craig: Skype?

Drew: Skype was basically free.

Craig: What were you stealing from work then?

Drew: Pens.

Craig: Pens? Physical pens?

John: Yes, it’s just not worth as much.

Craig: Drew, you might’ve been stealing funds. Just fully embezzling.

Drew: Yes, just absolutely.

Craig: Funds? I was stealing funds.

John: We had a writer from Australia write in to say that the opposite phenomenon was happening there.

Drew: Anonymous Down Under says, “The situation here in Australia is an interesting flip of this. When the major international streamers all set up shop here over the last three to five years, they uniformly put young, relatively inexperienced people in charge of their Australian branches. This in turn uniformly pissed off all the established producers and creators because they felt, sometimes legitimately, sometimes not, that they were pitching to someone much more junior than them.

On a more existential level, we had all these Gen Xers suddenly terrified that they had been superseded before they’d had a chance to achieve anything. As it turned out, all the major greenlight decisions still got made out of the US anyway, and everyone got used to the idea that a young person might actually have some good ideas after all.”

Craig: Well, damned if you do, damned if you don’t. Obviously, everybody’s cranky about everything. One of the things about a limited resource industry is that people will immediately start blaming each other for the reason why they’re not getting the resource. The reason they’re not getting the resource is because there aren’t anywhere near enough. In this case, we’re talking about writing jobs or getting a show on a streamer. It’s a one-in-a-million shot anyway. Yes, you could blame the young person. You could feel it’s an indignity. I think if you’re in Generation X and you’re saying, “This has happened before I even had a chance to do something,” you’re in your 50s. We got to go start to shuffle aside for the kids at some point.

John: The first time you’re working with, and for, somebody who’s younger than you, it’s a little bit jarring, but you get past it, you get through it.

Craig: I also think that if somebody’s smart, it doesn’t really matter. I think it’s cool. I also think sometimes when I’m working– I’ve been in situations where I’ve been writing something and there’s a couple of executives that- actually, all the executives that I work for at HBO I think are a bit younger than me. One of them is very young. I never think like, “This is nonsense.” No. I just think sometimes it’s a benefit because when I was 26 and the person I was working for was 50, they looked at me like, “You’re a kid.” I looked at them like, “You’re my dad.”

Now I think sometimes people that are younger are like, “Oh, here’s the calming older presence here who’s been around a lot.” It’s a little harder for them to say, “You don’t know what you’re talking about.” I don’t mind it. Do you have any weirdness at all?

John: No, I think sometimes I need to watch what I’m saying so that it in no way sounds patronizing, or sounds like, “Young whippersnapper, you don’t know what you’re talking about; I know what I’m doing here.” Also I feel like they’re coming to me with the expectation that I do know what I’m doing in these circumstances.

Craig: I do think if you trotted out “Young whippersnapper,” they wouldn’t even know what that means.

John: Yes, absolutely. Completely.

Craig: “Sorry, the what now?”

John: Absolutely. Monty Burns is sort of a–

Craig: They’d be jumping on TikTok, “What is whippersnapper?”

John: Hop back in my stagecoach.

Craig: Even the fact that I said jumping on TikTok. God.

John: Cringe.

Craig: If my kids could hear me now, they’d barf.

John: There’s really nothing more cringe than cringe though.

Craig: Cringe is the cringiest. We’re recording this the day after Thanksgiving.

John: Yes, so this will come out two weeks after.

Craig: Is Amy home? Did you have Amy here?

John: No, it’s so bizarre to have my kid going to visit her friends in the UK, because it’s like, “Oh, it’s just a long weekend, so I’m going to go visit my friends in the UK.”

Craig: My youngest daughter, Jessica, is here in town and we combine Thanksgiving with another family and they have three daughters. One is in the UK, but the two that came are both high school age, senior and freshman, I think. I’ve never felt older in my life. I’ve actually gone so far around that I’m kind of cute. It’s funny how out of touch I am. They like it.

John: It’s always fun when she’ll like drop a name of some celebrity and it’s like, “Do you know this?” I could just quickly Google and provide context, but I will honestly answer like, “I have no idea who that person is.”

Craig: That’s cool. I think sometimes if you try, that’s where it gets cringe. Stay in your lane. Stay in your lane, dad, be dad. They kind of want that.

John: All right. All right, well, let’s get me fully back in my lane here because we have some AI to talk about. AI and screenwriters to talk about. This all blew up, as you’re hearing this now, a couple of weeks ago. This is Alex Reisner writing for The Atlantic, who has this article saying, “I can now say with absolute confidence that many AI systems have been trained on TV and film writers’ work, not just The Godfather and Alf, but more than 53,000 other movies and 85,000 other TV episodes.”

Craig: Sorry, did he say not just The Godfather and Alf?

John: Yes, he was trying to provide, I think, the broad edges of the framework, or maybe that was related to the prior paragraph which I omitted.

Craig: Oh God, I hope so, because what a weird way to just start.

John: What a lead.

Craig: “Not just The Godfather or Alf.” Okay, fair.

John: “These models have been trained on more than 53,000 other movies and 85,000 other TV episodes. Dialogue from all of it is included in the AI training data set that’s been used by Apple, Anthropic, Meta, Nvidia, Salesforce, Bloomberg, and other companies.”

Craig: Great. Great. Oh, fantastic.

John: You might think like, “Oh, they just scoured the internet and they found all the screenplays,” because you can find screenplays for everything, but instead, this is actually taken from opensubtitles.org.

Craig: I had a feeling.

John: What they do is, they extract subtitles from DVDs, Blu-ray discs, internet streams. Sometimes they’re just using OCR to actually see what’s on screen, and they’re uploading to this big database so you can find the subtitles for whatever episode or thing it is. You can criticize that for existing.

Craig: Sure.

John: But it’s also useful for translations for people who want to see things in other languages. It’s out there in the world. Basically, these models sucked it up and used that for training data, and you can see why it’s useful for training data, because it’s just dialogue, it’s just people speaking to each other. You have the context for what it is. It doesn’t have all the other goop around it. It’s well-formed. Honestly, our podcast is two people talking to each other. It’s probably useful for training data for stuff.

Craig: Great. Can we get them working? Can we get that going for next week?
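[To make concrete why subtitle files are such convenient training text, here is a minimal sketch, in Python, of how the numbering and timestamps in a standard .srt file can be stripped away, leaving only clean dialogue lines. The file name and function are hypothetical illustrations, not anything from the episode or from any company’s actual pipeline.]

```python
import re

def srt_to_dialogue(path):
    """Strip cue numbers and timestamps from an .srt subtitle file,
    keeping only the spoken lines."""
    dialogue = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue  # blank separators between cues
            if line.isdigit():
                continue  # cue sequence numbers (1, 2, 3, ...)
            if re.match(r"\d{2}:\d{2}:\d{2},\d{3} --> ", line):
                continue  # timestamp lines
            dialogue.append(line)
    return dialogue

# Hypothetical usage: what remains is just well-formed dialogue,
# the kind of clean text John describes as handy training data.
# print("\n".join(srt_to_dialogue("episode.srt")))
```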

John: I want to talk about this legally, ethically, philosophically, and how we as writers probably do feel about it and what things can be done about it.

Craig: That second question’s the fun one, isn’t it?

John: Let’s talk about your emotional reaction to this and what this makes you feel like.

Craig: Well, I think I’ve probably felt all the immediate feelings in the past. What I feel like now is a sense of general resignation. I feel like the guy in Tiananmen Square, “No, tanks, stop.”

In the end, people who are only familiar with that photograph don’t realize that, no, I don’t think that man died, but the protesters lost and lost permanently. I don’t know how to stop any of this. I don’t think it can be stopped. We are probably baited into arguing about it and then AI will take transcripts of our arguments and learn from them.

John: I think a lot of writers and some writer friends of ours– Robert King was on some podcasts talking about how he was feeling about it. I think a lot of people are in those earlier stages and they’re feeling a lot of the feelings. I want to talk about the feelings. I think the feelings are valid, and then also talk about what can actually be done and how not to get baited into the wrong fights over it. Let’s start with, I think a lot of writers feel angry. When you hear why they’re angry, they’ll say, “It’s theft. This is theft.” If someone steals your car, that’s theft. If someone makes a bootleg copy of your movie and sells it, that’s copyright infringement, which could be a criminal act. There’s also civil penalties for that.

As we’ve talked about on the show, when someone steals your idea for a heist film set during the Iditarod, that’s not really theft in the same way. This could be closer to that third thing where it’s like they’re not taking your– as we described, unless you are actually taking the expression of those ideas rather than just the idea itself, unless you’re using that expression of ideas and showing that stuff, it’s going to be very hard to make a case against it.

Craig: Well, when people who do what we do talk about theft, my general response is, you’re talking about somebody stealing something you don’t own, because you gave it away when you took the money. What we do, we don’t own the copyright, and the companies do. It’s their property.

John: It is.

Craig: This came up when Napster came around back in the late ’90s, early 2000s. Then following that, all the file-sharing services like LimeWire and so forth, and then BitTorrent. Everybody was panicked that everybody was going to steal everything. Writers were upset that their residuals were going to go away. I just remember thinking, “Well, if the companies that own this stuff don’t care, then it’s all over.” But generally, they do.

John: They do.

Craig: This is one of those times where I think we get to hide behind the monster we’re usually fighting, because if there is some compensation for this, it’s the studios. They’re going to have to figure it out. Problem is some of those studios, I think, don’t care. Apple, I don’t think they care. I don’t think they care. I don’t think Amazon cares. I think they’re probably into it. I think they’re probably sitting there going, “Well, what if we could replace all these people?” If that happens, if the studios are willful collaborators in this theft so that they can enable the tech industry to replace all the humans, then nothing matters anyway. It’s over.

John: As a model of an industry coming up and pushing back against this, we were listening to those examples of songs that were generated from AI models that listened to a bunch of songs and could recreate them. Give me something that feels like a surfy kind of thing. It’s like, “Oh, that’s exactly a Beach Boys song.” It has the lyrics of a Beach Boys song. Those examples are so clear cut; it’s much harder to find examples of that in our texts. Doesn’t mean it won’t happen, but it’s harder to do. That’s going to be the interesting thing if they decide to go after it, which they might.

Craig: For the case of songs, artists do own the copyright to the publishing, to the lyrics and the music itself, not the recordings, although some artists do. It’s a more complicated situation. Individual stars can go after these people, I suppose, like Taylor Swift could probably do that. If people are going to go through Big Fish and they’re going to go through The Last of Us and they’re just going to scrape it and teach it to a thing so it could write Big Fish 2 or a Last of Us spinoff, if HBO or Sony, Warner Brothers or Sony, if they don’t care enough to stop that from happening or sue somebody, it’s happening.

John: Yes. Individually, we’re not going to be able to do anything about it. Let’s talk about a different thing which gets conflated with it, which is plagiarism. Vince Gilligan was on the show; it was a great episode when he came to speak with us. He described generative AI systems as basically “an extraordinarily complex and energy-intensive form of plagiarism,” which is such a great quote for this. Plagiarism is interesting because it’s not a criminal thing. Plagiarism is a moral thing. It’s a set of rules we’ve agreed upon. Institutions will have ways to define plagiarism and enforce them.

Plagiarism is generally representing someone else’s ideas as your own without proper attribution. If you could put a quote in from somebody, that’s great. You take away those quotation marks and the citation, that’s plagiarism. It’s useful to think about these AI systems as if you were to use them to generate some text, it could be plagiarized and you’d have no way of knowing that it was plagiarized. You’d have no way of actually checking to see what that is from. It could string together the words that are actually someone else’s expression of that thought and idea and it’s really hard to know where it came from.

Craig: Which is also the case with regular plagiarism.

John: Yes, it is.

Craig: Plagiarism is immoral for that very reason. AI doesn’t pretend to not be plagiarism. They advertise their plagiarism. That’s the whole point.

John: I would say the plagiarism though, again, it’s the taking someone else’s idea and saying that it’s your own.

Craig: Which they do. Because look, when the Beastie Boys put out Paul’s Boutique, they originally just didn’t credit all the 4 billion samples they used. Everybody was like, “Yo, there’s A, the legal question of whether or not you can use this. B, you’re kind of pretending you made this.”

John: To me, Paul’s Boutique though, there’s a legal question there because of sampling. Because you could say this is directly–

Craig: It was both. There was a sample there and that was a whole legal thing, and they did have to end up crediting all these people. There was also just an ethical, plagiaristic question. Do the Beastie Boys, are they representing that they came up with this groove? Are they out there saying– Look, now, Paul’s Boutique’s awesome. They didn’t want to plagiarize and they did say, “Okay, sure, we’ll do all this.” They were young and they didn’t really care. I think that, yes, AI is essentially plagiaristic because the detailed training– when you say, “Okay, I’m going to feed you every Robert Frost poem. Now, give me a Robert Frost poem.”

John: It gives me the Robert Frost poem. The generation of that fake Robert Frost poem is the plagiarism.

Craig: Yes. Correct.

John: It’s the output that is plagiarism, not the input that’s plagiarism.

Craig: Correct. It’s the output.

John: That’s one of the distinctions I want to make here: training the model may not be plagiarism. It’s outputting anything from it that is.

Craig: It’s the output. No question. No question. Now, if AI had an ethical component to it, which would have to be imposed by law to identify everything that it did as AI and to say, “This is not a Robert Frost poem, or somebody that’s writing poetry that sure is awesome like Robert Frost, but rather this is an AI emulation of Robert Frost,” fine. I get that. I think that’s probably not plagiarism because it’s about acknowledgment.

John: Well, except that if I say it’s not a Robert Frost poem, wouldn’t you still have to cite the source of where it’s coming from, or at least–

Craig: I don’t think so. I think that specific citations are about academic rigor. The key with plagiarism is to say, “I’m acknowledging that I borrowed this,” rather than trying to pass it off as my own.

John: I get that.

Craig: If you acknowledge it, I think you’re out of plagiarism town and you’re also opening yourself up for people to properly evaluate and say, “You didn’t actually just do this by yourself. You read every single thing and then did this.” I think, honestly, if a human reads every Robert Frost poem and then writes a poem-

John: In the style of Robert Frost.

Craig: -as an homage, that’s not plagiarism. But the fact is there is not a human involved. Since it is only the text and nothing else, no life experiences or anything, it just gets much clearer to me that it is.

John: All right. Getting back to the feelings of all this, we have, “This is theft, this is plagiarism, or this is training something to be a replacement for my work.” That I described initially as the Nora Ephron problem. Imagine you fed all of Nora Ephron’s scripts into one of these systems and said, “Now give me a new Nora Ephron script.” That feels really wrong. It will continue to feel really wrong for me, because you are taking a writer’s work and generating just a fake version of Nora Ephron in a way that’s calculated, and it feels gross. Nora Ephron is no longer alive to be competing against it, but I am alive and you are alive.

If they say like, “Here are all these John August scripts, give me a John August script,” I’m suddenly competing against a version of John August who can work 24/7 and generate a million different scripts. That’s unfair competition. That’s what–

Craig: It’s not competition at all. You’ve lost. This is where I stand apart, I think, from a lot of people, because the silent phrase that is in front of “They’re training our own replacement” is “You don’t understand.” Oh no, I understand. What am I supposed to do about it? There’s nothing I can do about it. We can all be John Henry and say, “Look, I can pound these railroad ties,” or whatever he’s doing, as fast as that steam engine. John Henry died at the end of that story. The steam engine goes on pounding the railroad spikes.

John: John Henry is the Tiananmen Square guy.

Craig: We are all John Henry here. There’s nothing we can do. People say these things like, “If only people understood that we were training our own replacements, they would rise up and…” What?

John: What would they do?

Craig: Yes. Like when you say it’s calculated and it feels gross. Yes. That’s what corporations do. That’s how we got Lunchables.

John: You just described capitalism.

Craig: That’s the whole thing. That’s why they’re successful. They don’t have the qualms that regular people have. If it’s going to happen, it’s because it’s what people want. In the end, this is all driven by a marketplace. If people go, “You know what, actually, I’m fine. Oh yes, give me AI Friends. It’s fine, I’ll watch it. It’s fun. It’s almost as good as the real thing. In fact, it’s better.” Then we’re done.

John: I want to separate two things out there. Beyond giving us AI Friends, our work isn’t just being used to train fake versions of what we do. It’s actually being used to train the models to do all the other stuff. Like having Alexa be able to speak back to you in a more natural way does come from all the training that’s been done on dialogue. It’s not just about directly replacing the work that we’ve been doing. It’s part of a bigger–

Craig: Yes, also we may encounter something that AI does that was prompted as “Give me a romantic comedy written in the style of John August,” that you will watch and not know it was prompted by that.

John: Oh, totally.

Craig: It will seem original even to you. If these things are to pass, then it’s over. The whole reason copyright law exists in the first place is to protect artists so that there can be some innovation. The best argument that we can probably make against AI at some point is if you do this to the extent that this is no longer a job, you’re going to run out of stuff to train them on. They’re just going to turn into a loop of self-training and it will flatten out and go nowhere.

John: Maybe, and that’s a strong possibility, but it’s a question of when does running out of that data really slow the progress and is there a different way that they can progress beyond that? Because at a certain point it may not matter that much.

Craig: Then it really doesn’t matter.

John: Well, to summarize– I want to validate and sit with what it feels like to be a writer in this moment. You can feel anger and indignation because this is a violation. This is a theft. It feels like plagiarism. That sort of sense. If you’d asked me whether you could train on my stuff, I probably would have said no, but you didn’t even ask me.

Craig: It’s not yours.

John: It’s not mine. In some cases, some writers, it is their stuff.

Craig: That is a different deal. Yes, that’s a different deal.

John: I think writers feel threatened that this thing could replace them, and also powerless, which is what you’re describing there. It’s a sense that we have no agency in this fight.

Craig: We don’t.

John: We don’t. I want to propose a thought experiment. Let’s say that you’re one of these writers who’s feeling all these feelings, but you were able to peer inside the LLM and say like, “Oh, wow, actually, none of my work was used to train this.” If you actually realized like, “Oh, none of my stuff is there.” In the case of this most recent thing, anything written after 2018 isn’t in there. Does it really change how you feel?

Craig: No.

John: It doesn’t. That’s why I think “They’re training the model based on my stuff” isn’t necessarily as big a thing to be focused on.

Craig: It’s not an objection over an individual violation. It’s an objection over how our vocation is being viewed, treated, and used. If they can do it to you, that means they can do it to me, so there’s a little bit of a selfish concern in there. Mostly, it just feels wrong and unfair, and I suspect we’re all looking at each other the way that welders did in Detroit right before the robots wheeled in. What can you do though? This is one area where I think we have to all look at each other and realize that we are collectively complicit in creating the marketplace. We want to blame corporations.

I can say, yes, corporations don’t have qualms. They have no problem sitting there and injecting thousands of chemicals into something to create the Lunchable, which is– I’m obsessed with Lunchables because I love the name.

John: I’ve never had a Lunchable in my life. I know what they are, but I’ve never eaten one.

Craig: It’s terrifying. But here’s the thing, people like Lunchables. If they didn’t, then Lunchables would have failed. The corporations are venal and greedy and have no morals, but it’s only in pursuit of giving us what we seem to want. Now, the consumer base, a lot of times, is not aware of what they want because there are things they don’t know they want.

John: Absolutely.

Craig: There are things that haven’t existed yet that they were just unaware of, and then suddenly, boop, there they are, and then everybody goes crazy over them. This is an us problem. We like cheap things. We like cheap things, and we like things fast, and we like variety.

John: We’d rather have sugar than a difficult-to-digest thing. We are wired for that, and so I think sometimes this stuff that comes out of AI does feel like sugar. It’s like it solves this immediate hunger really quickly.

Craig: We play D&D every week. We typically will have Doritos. Cool Ranch Doritos…

John: Incredible. What an achievement.

Craig: That team of scientists should get a Nobel Prize and also probably be put to death for what they have done. That flavor powder is astonishing to this day, and it’s been decades now, but I still remember when that blue bag came out, and I was like, “Oh, what’s the new thing?”

John: Craig, you and I are old enough that we grew up at a time when ranch dressing became a thing.

Craig: Yes. Ranch dressing was the proprietary dressing of Hidden Valley Ranch, an actual ranch.

John: Yes, so amazing. Incredible.

Craig: I know.

John: All right, let’s talk legally and philosophically about this moment that we’re at. Legally, the copyright questions are still TBD, so it’s unclear whether it’s fair use to ingest this material. I would separate the ingesting of material versus outputting stuff that was based on that material. We don’t know whether the material generated by LLMs can be copyrighted. Right now, no-ish, but it really becomes a question of, well, how much of that was output from this model, and that’s tough. There are going to be situations like the music examples before, which are just so blatant that, well, of course, that’s a violation, but other stuff could be more subtle.

The question of whether this is, legally, unfair competition or restraint of trade, that’s a live ball. The FTC and the new administration, I don’t see them tackling this.

Craig: Any administration, it doesn’t matter, they’re not going to move fast enough. Every week, this changes, and the gears of federal justice are glacial. The legal venue that may make a difference, if any venue will make a difference, is Europe.

John: Agreed.

Craig: Now, Europe, they’re pretty severe about data protection. They’re pretty severe about advertising online and representations, truth and so forth, and clarity, misinformation, and I could certainly see them getting pretty deep in on this and pretty quickly. If you are Google, you don’t want to just not be able to be in Europe. That’s a problem. That’s a problem for all these guys. So that becomes an issue, but here’s the thing, Europeans like stuff too.

John: Also, I think we have this sort of understandable big corporate Western bias, but the same technologies that made OpenAI, or made Claude, or made Google, can be done in China, can be done in other markets, and free models already exist. There are other models out there. The genie’s out of the bottle. It’s going to be there.

Craig: The only thing that’s centered on us in the West is that we are making a lot of content for the globe. It’s one of the few things that America makes that is devoured internationally on a large scale. Obviously, there are huge entertainment markets overseas, like in India and China, but if you compare, for instance, how many movies or television shows come out of Europe as opposed to the United States, it’s probably not even close. Yes, it is a thing. I don’t honestly know where it’s going to go. All I know is that we’re going to yell and scream about it a lot while we are conveyed towards our destiny.

Just imagine all of us on a moving platform yelling about it and debating what we should do and where we should go, and the platform just keeps moving towards its final destination.

John: One of the other big challenges legally is you think about, oh, there should be a court fight. Who is the injured party? Is the injured party the original writer? Is it the copyright holder? Is it society as a whole?

Craig: No. The society as a whole has no standing.

John: What is the proper court to even be deciding this in? We obviously think about US laws.

Craig: It would be almost certainly federal because that’s where copyright law is. The companies that own the IP, that’s what intellectual property law is designed to do.

John: Again, if they tried to go after the ingesting phase, the fact that this was used for training, I think they’re not going to win. They have to be able to show the output phase as being the problem.

Craig: Which they would, but the amount of time it takes to do all that– Again, while you’re doing all of it, it just keeps going. Then the threat of a settlement keeps growing and growing. Who are you suing? Are you suing Google?

John: Yes.

Craig: Well, if you’re suing Google, that’s fine. Let’s say you’re Disney and you’re suing Google. At what point does it become easier for Google to just buy Disney? Where do we think Apple’s priorities are? Their handful of shows or their massive tech business? You can see the writing on the wall here.

John: Let’s move aside from legally and think philosophically and morally. Setting aside whether it’s legal to scrape the internet, is it philosophically moral to scrape the internet? Because, really, Google did this to create Google. Google searched everything. It’s impossible to actually Google the answer to, “Was there a controversy when Google scraped the internet?” Because I’m sure there were people who were freaking out about that because they’re like, “Wait, you’re reading my stuff and processing it and serving it up.” It’s not the same thing, but it’s analogous to the same thing.

Craig: Well, they were crawling and collecting, but they were really just collecting links. “Here’s a link to a page.” Then they were seeing how many other people linked to that page. That was their big PageRank. That was their big–

John: Well, they had to know what was on the page and do a bunch of sorting on that page to figure out like, what is this page really talking about?

Craig: Right. I don’t know if that was considered controversial at the time. I think everybody was just thrilled that Search worked. Of course, people that were making content on the internet, businesses in particular, were so excited that there was a way for somebody to find it.

John: Yes, because it was useful.

Craig: Yes. When you put stuff on a webpage, then how did you get people to go there? By giving them this endless long link that started H-T-T-P.

John: Or getting Yahoo to put it in the big category. The big-

Craig: Right. The list.

John: -catalog of everything, yes. A list of everything.

Craig: Yes. The phone book, right? I don’t know if anybody complained then. Is reading everything on the internet, or handing it over to something, wrong? No, it’s perfectly fine. To me, that’s no more illegal than reading a book.

John: I think philosophically, “reading” and “copying”, how we feel about them really depends on where we’re sitting because I think the AI technologists will say, it’s reading.

Craig: It’s reading.

John: It’s reading. It’s reading a thing. It was like, “Oh, you’re making an illegal copy.” Every webpage you’ve ever visited is a copy of that webpage. You’re not actually pulling the original webpage.

Craig: Correct. You don’t make anything until you make something. If you said to people, “Listen, I’m building a large language model and I’m going to have it read everything you ever wrote, but it’s never going to write anything itself. It’s just reading because it likes to. If you want to come over and talk to it, you can, but it’s not going to write anything,” who would have a problem with this?

John: Some people would have a problem, but most people would not have a problem with it. Interesting counterexample here is Google Book Search. Google scanned hundreds of thousands, millions of books, and then it would show you a little excerpt from that book. Authors argued like, “It is taking away the value of my book because people can find what they want on that little book search and not actually have to get the book itself.”

Craig: I’m sure the book publishers would disagree and say, “Oh, no, no. No one was finding your book. Nobody was buying your book.” Now, 80 people bought it because Google Book Search led them there. Again, copyright’s a different situation there for novelists. For us, we are at the whims and mercies of the companies for whom we work, and they are either, in various levels, identical to tech because they are those companies, in bed with them or floating out on their own. The ones who are floating out on their own, I think, are the ones that are terrified right now, and probably looking for a tech buddy to join up with.

John: Yes. I’m hoping we still have some listeners who are still outraged. Who feel like this is outrageous and something has to be done because I would then prompt three questions: What do you want to see done, who do you want to see do it, and would the strategy be effective?

So, what do you want done? Do you want to shut down any model that’s been trained on this data? Do you want to compensate the writers whose work was included? Do you want to ban the future use of training off this or similar materials? Those are things you could ask for. You’re shaking your head. I don’t think they’re achievable.

Craig: No, they’re not achievable, nor would they even be enough because technology is just going to get around that. It’s like water. It’s going to figure out how to get where it needs to go, even if it has to carve a canyon through rock. Oh, we didn’t train it on your stuff. We trained it on this stuff that was trained on your stuff by somebody else who’s out of business now. That was free leave. There are so many ways for these companies to engage in f-ery. That’s F-dash-ery. I think we’re just kidding ourselves.

John: Yes. Honestly, I feel the same way I feel about the pandemic, which is that I feel some people who are so outraged and angry, it’s like, well, they want a time machine, and there’s just not a time machine. I can’t take you back to a time before the pandemic. I’m sorry you might’ve voted for this person because you believe it’s somehow going to take you back to 2019, but it won’t, and we’re still here, yes.

Craig: Yes. Now more than ever, I think it’s important to engage in the Serenity Prayer when we can.

John: “Worry about the things I can control,” to paraphrase.

Craig: Yes.

John: What’s in our control?

Craig: In this instance? The only thing, as far as I can tell, that is in our control as writers is whether or not we assign copyright to another company of original material that we’ve created. That’s it. That’s the only thing in our control, and that has always been the only thing in our control. Even as a union, that stuff, that collective bargaining, it’s also not really in our control.

John: No. I get frustrated because Kim Masters on this last episode of The Business was saying like, “I got to believe that the WGA should do something.”

Craig: Oh.

John: Kim–

Craig: I love her. She’s smart and everything, but the WGA is not going to be able to do anything here.

John: First off here, everything that could have been done, we did, and we did first. Writers are human beings, material generated by LLMs is not literary material. Writers cannot be forced to use LLMs. We are negotiating a contract with our employers. As far as our employer relationship, I think we’ve done everything we can. We should defend what we’ve done and make sure we don’t lose those protections.

Craig: We can expand it as maybe some f-ery occurs, but the WGA isn’t Batman, right?

John: No.

Craig: All they can do is control that contract. If the companies arrive at a place where they can create literary material that is of the same quality or, God help us, better than the stuff that we make as humans, there is no more WGA. It doesn’t matter. What are we supposed to do? Just argue over a contract that employs nobody because they’ve got the robots doing it? I just think when somebody says the WGA has to do something, they’re almost setting up someone to blame.

John: That’s really what I do feel like because it’s like, listen, the strike was not about this, but it was partially about this. I testified before the Copyright Office and the FTC. Our president testified before Congress. Do you want us to enter a giant lawsuit against somebody? That’s going to waste a bunch of money.

Craig: It’s not going to work. While we’re doing all of that, what will be is what will be. We don’t like these things, but if the rest of the world does, we lose the vote, and the market votes with its money.

John: I want to make sure we’re focusing on what things we can control. As a writer, you have the choice of what technologies you’re going to use and what technologies you’re not going to use. You can be smart about those things. It’s also, I think, good to make a set of policies for yourself and stick to those policies. If you’re never going to touch one of these systems, God bless you, stick with that and make a plan for that.

We should continue to fight for the protections that we already have. We need to keep ourselves educated about these things, and defending the idea that art should be created by human beings is a noble thing to keep fighting for. Set professional standards for ourselves and others. I just think this particular fight is a dumb hill to die on. It’s just going to be a distraction from actual meaningful fights about the future of our labor.

Craig: The thing about hills to die on is you got to go have a chance to not die. This hill, this is Death Hill, right? It’s not that we don’t think it’s important enough to fight for, but there are things where you can just tell this toothpaste isn’t going back in the tube. In fact, we’re not even sure what’s about to come out of the tube. We have no idea. All we know is it keeps coming day by day. What’s going to happen is we’re going to take our stands and we’re going to be angry and we’re going to say our things. Then somebody that we really know and like is going to be like, “By the way, I just had this incredible interaction with AI and did this thing and it’s great. It actually is super, and culturally, just watch.”

What are you going to do? You’re going to just yell at cars all day long because you really loved horses?

John: No.

Craig: It’s not going to work. When it comes to protecting artists, I’m afraid that in our line of work, not painting or songwriting, but in our line of work – television and film – we are subject to the vicissitudes of our employers and their varying interests in whether or not they want to defend their own intellectual property. That’s what we got.

John: Yes. I think if you were to take all of our work out of the models, everything that a WGA writer has ever written, pull it out of the models and permanently ban it from all the models, the models would be slightly worse. A slightly worse AI would still eat your job.

Craig: Yes. Maybe they would just get to where they were going to get a little bit later.

John: A month.

Craig: That’s the part that’s really upsetting. This has been something that has happened throughout history. Typesetters must have been really pissed when word processing came along and just automated [crosstalk].

John: Yes. Automated that whole thing.

Craig: This is what happens.

John: Elevator operators.

Craig: Ah. Which is why I love New York, because there’s still like, you know what? Every now and again you walk in an elevator, there’s a guy. Hopefully we’ll make it. I don’t really think there is an example in history of anything like this.

John: Yes, it’s different.

Craig: This is different, which is terrifying. What is also terrifying is how blithe everybody is as they run around and run toward it, and yet everybody seems to understand that it’s happening. Mostly people seem to be shouting at each other about it. Which, if I were a conspiracy theorist and thought that AI was trying to take over the world, I would suggest that AI had been doing a brilliant job of turning itself into the distraction that we all yelled about while it quietly ate our Lunchables.

John: Let’s answer some listener questions. First, we have one from Jonatan about finishing work.

Drew: Yes. Jonatan says, “Do you think that every screenplay should be finished no matter what? If you’re working on a script and realize that it’s not good enough to become a movie, is it better to finish every script regardless so that you make a habit of actually finishing your stories and not normalizing quitting, or is it better to drop a story when you realize it’s not good enough?”

Craig: Normalizing quitting?

John: Normalizing quitting.

Craig: I love the kids. I think that if you are early on, this is your first or second script, yes, get to the end.

John: Get to the end, yes.

Craig: Finish it, know what that means, even if you see by the time you finish it why it was not meant to be finished. If you’ve got a couple behind you, if you’ve ever finished a screenplay and you’re writing a script and you’re like, “Oh no,” then dropping it isn’t normalizing quitting; it’s just not working. Ball it up and– think of it as a really, really aggressive rewrite, where you’re rewriting it into something else entirely.

John: I think it’s important to finish a script. Craig and I have our feature bias. We were thinking about a 120-page script, which is a long thing. Listen, that could be months more of work. I don’t want you to kill yourself over something that saps all your will to live, to finish this thing, if you think it was a bad idea or a fundamentally flawed premise.

But it’s also important to realize that writing is just hard. At a certain point in a script, everyone goes through that crisis of faith in a project. It’s like, “I don’t know how to do this thing. It’s the worst idea. I should never have pursued it.”

Craig: Yes. That’s why I think if you have one finished, then at least you know what it means.

John: You know what it feels like. You know that place. On the second script, on the third script, you’re like, “Oh yes, I recognize this feeling. It’s not the end of the world.”

Craig: I think default to finishing, but it’s not quitting. It’s making an executive decision about your artwork.

John: Yes. Let’s answer one more question. This is from Brett, who has a question about his first contract.

Drew: Brett writes, “I’ve been ‘hired’ to write my first assignment. First, thanks so much. All along the way, as producers argue and the director gives notes, your voices have been echoing in my brain reminding me that my job is to make everyone feel heard and respected, while ultimately protecting the movie. Quick preface: I work in music, and I know this director from music video shoots where we’ve crossed paths in the past. Here’s the question. This is a non-union gig. The budget is $10 million. There is IP from a well-known song and participation from a well-known musician. Because it’s non-union, the producers have basically put the onus on me to define my financial terms.

I’m not cash-strapped, so I’ve been creating literary material without any agreement, but it’s time for me to start the screenplay, and they have asked me again about pay. I would like to share in the back-end success via residuals, but I assume that’s impossible on a non-union production. Could I or should I ask for a tiny percentage of the sale? Otherwise, would you recommend asking for some amount due upon delivery of the first draft? Maybe a weekly rate for the rewrites and polishes?”

John: A $10 million movie is not tiny, and it feels like this could be a WGA movie if they chose to make it a WGA movie. It’s really easy to spin up an LLC and become a signatory, but they’re not going to do it, so that’s probably not a fight worth having. On a $10 million movie, you should be getting terms like what you’d be getting on a WGA film. What I would say is go on the WGA website, pull up the most recent contract, figure out what the prices are for a draft, for a set of revisions, and work off of that as your template. That should be the floor you’re thinking about rather than starting from scratch.

In terms of back-end, they may not know what they’re doing either, so there might be some definition of something that is actually meaningful. Regardless, you’re going to want to have an entertainment attorney take a look at this to make sure you’re not signing something that’s just dumb.

Craig: I think probably an entertainment attorney here would also be helpful to provide context. Because if they are reputable and they work at a firm, this is not the first time these circumstances have arisen. They can say, here are other movies that roughly cost $10 million that were non-union deals with non-signatories, and this is generally what we try to do: we try to capture X percentage of the budget for the writer, which is very typical.

John: Back in the day when we were doing budgets, and Drew, correct me if this is wrong, because you’ve done this more recently, 1.5% is what it is.

Craig: 1.5%, okay.

John: Drew, is that familiar to you at all?

Drew: That sounds right, yes. We tend not to do back-end anymore. Everyone is pushing more towards box office bonuses.

Craig: And back-end would be a trap with a company like this, because the worst possible news is, “Yes, we grant you all of your back-end requests,” which, as worded, will never equal money. So a buyout could be possible.

John: A production bonus would make a lot of sense.

Craig: Production bonus. Also, is this going to be a negative pickup for a distribution company? Part of that fee? Do we get a percentage of that sale, as defined by what? You need a lawyer. You need a lawyer real bad. The WGA minimums would be where I would start, and a lawyer will help you with this. There’s no way around that. We’re not lawyers.

John: No, so we can only point you in directions of things you’ll talk to your lawyer about.

Craig: Yes. Like this.

John: Yes, like this. Money.

Craig: If you’re going to ask a question about contracts, nine times out of 10, we’re going to be like, “You’re going to need to check with a lawyer.”

John: Yes. I wouldn’t say ChatGPT would be your friend here.

Craig: No.

John: No. They’ve not had the on-the-ground experience with these kinds of contracts.

Craig: You could hire an AI lawyer, and you’d go to real jail.

John: Great. It’s time for One Cool Things. My one cool thing is this video I watched a couple weeks ago. This is Jon Batiste hearing a Green Day song for the first time. Jon Batiste is an incredibly good composer, singer, songwriter. Just brilliant at the piano, has sort of Stevie Wonder energy, and can basically riff on anything. In this video, they have him with headphones on and he’s sitting at the piano. He’s hearing this Green Day song for the first time. He has no idea what the song is, and he’s not told it’s Green Day.

Craig: Oh, I’ve seen this. It’s great.

John: Yes, it’s great. He’s just hearing the vocals and drum track, and he’s just at the piano figuring out what the music is that goes with it, and it’s just– off the top of his head it’s brilliant. Just to see this–

Craig: Interesting.

John: Interesting. Completely different but completely interesting. Craig and I both had the experience of being able to work with really talented composers who could just do anything. Suddenly, things that are–

Craig: It’s magic.

John: Yes, it is genuinely magic. He is just a magician. Seeing what he’s doing, whilst also just seeing the joy he’s feeling in the moment, and then actually hearing the full track versus what he did, it’s incredibly good. If you just want to see the value of actual human beings in creation of art, I can think of no better example than Jon Batiste listening to Green Day. We’ll put a link in the show notes to YouTube.

Craig: I also, my one cool thing derives from a video. I, like millions of people around the world, opted to make the viral Mac and cheese for Thanksgiving. This is Tini. I think it’s pronounced Tini? Tini, T-I-N-I?

John: Yes.

Craig: I should know this. Anyway, she had a video, it was on TikTok, where she makes Mac and cheese. For some reason – and even she is like, “I don’t understand why” – it became the sensation, and everybody felt a strong need to try and make this Mac and cheese.

John: What’s different about this approach?

Craig: Honestly, I just think it’s a solid approach. She recommended cavatappi pasta, which is much better than elbow macaroni. Shredding your own cheese-

John: For sure.

Craig: -because pre-shredded cheese has starch on [crosstalk].

John: Now, she’s making a béchamel sauce and melting the cheese into it.

Craig: She’s making a roux-

John: That’s classic.

Craig: -which turns into a béchamel. It was nice also watching it because I cook a lot, so it was cool to think, “Oh, a lot of people are now learning what a roux is, which is cool.” Some interesting flavors in there. Smoked paprika and a little bit of Dijon. Anyway, I made it.

John: Was it good?

Craig: Outstanding.

John: Oh, it’s great to hear.

Craig: Like 11 out of 10 would make again. Really, really good.

John: Breadcrumbs on the top?

Craig: No.

John: Oh, okay.

Craig: No, no breadcrumbs. In fact, she was very, very adamant. Like, “No. Get your effing breadcrumbs away from my Mac and cheese.” No. At the very end, you just put it under the broiler for like two minutes just to crisp it up. That’s it. It’s intense. It’s a heavy dish. It’s not an everyday food.

John: What’s so fascinating about Mac and cheese is that there’s two separate categories of things. There’s the Mac and cheese you’re describing, and then there’s just Kraft. Kids who love Kraft, and you try to give them your Mac and cheese, they would throw a fit.

Craig: Kraft, as we have mentioned earlier, is a corporation that spent so much money coming up with that orange powder, which is awesome, by the way.

John: It’s also great, yes.

Craig: A Kraft Mac and Cheese is delicious. I resent it for being that delicious, but also, when you look at the effort, I will say, Tini’s Mac and cheese-

John: It’s a lot of work.

Craig: -it took a while. Just a little elbow grease getting all that cheese shredded there. Yes, I thought it was great. Tip of the hat to her.

John: Awesome.

Craig: She did a nice job.

John: We’ll put a link in the show notes to that. That’s our show for this week. Scriptnotes is produced by Drew Marquardt and edited by Matt Chilelli. Our outro this week is by Nick Moore. If you have an outro, you can send us a link to ask@johnaugust.com. That is also the place where you can send questions like the ones we answered today. You’ll find the transcripts at johnaugust.com, along with the sign-up for our weekly newsletter called Interesting, which has lots of links to things about writing. We have T-shirts and hoodies. They’re great. You’ll find them at Cotton Bureau.

You can find the show notes with the links for all the things we talked about today in the email you get each week now as a premium subscriber. That’s new. We thank all our premium subscribers. You make it possible for Craig and I to do this show every week, along with Drew and Matthew. You can become a premium member at scriptnotes.net, where you get all the back episodes and bonus segments, like the one we’re about to record on the difference between living in a simulation versus living with a creator, or if there even is a difference. Is there a conundrum? Is there a paradox there?

Craig: Let’s dive in.

John: We’re going to dive in. Only for our premium members. Thank you to those folks. Drew, thank you for a fun show. Craig, thank you.

Craig: Thank you.

Drew: Thanks, guys.

[Bonus Segment]

John: All right. Drew, to start us off here. Read this email from Tim.

Drew: Yes. We got a follow-up from Tim, who writes, “In Episode 665, Craig’s one cool thing was the WIRED article about scientists reimagining the underpinnings of reality and discovering new depths of its elegant simplicity. He commented that simplicity makes sense since reality is a simulation. It made me curious. How would Craig make a distinction between the cosmic classifications of simulation versus creation? Both imply a closed system with intentional design and a first cause. Is it that simulation is usually associated with natural designers, while creation is often linked to the divine?

What, if any, distinction would Craig make between the type of designers who lie behind either model, and why does he prefer the simulation metanarrative?”

Craig: What a good question. I enjoy this. Okay. There are almost no differences. Really what it comes down to is that the idea of divine creation ascribes a sense of moral order and purpose to the universe. That is the most important thing, purpose, whereas the pure simulation way of thinking about things implies no moral order whatsoever and, very specifically for me, implies no significant purpose.

If, say, we launched The Sims, and we had gotten to a place where The Sims was so good that all the little individual Sims were actually fully conscious, would we be able to explain the purpose to them? The purpose is to what, amuse me? I guess that’s a purpose, but it’s not a divine purpose. It’s not spiritually significant. I suspect that the simulation that we live in is not spiritually significant, and I don’t think that there is a moral order that is implied by somebody. Oh, absolutely, it could be a person. It could be one person. We could be the work of one-

John: One consciousness.

Craig: -one consciousness, one entity that has coded this and is running it, or we could be the product of 2,000 simulations deep. I don’t know, nor could we know. But, of interest, I did read an article – I’ll have to find the link to it – where people were arguing about the Big Bang, and what they’re struggling with is they can’t get around it. It happened. They don’t know why. And every time they try and beat it, they can’t.

John: They try to get around it scientifically or philosophically.

Craig: Scientifically. They’re trying to say, “Look, surely there’s something other than an unmotivated explosion.”

John: It feels like division by zero. It’s undefined, yes.

Craig: It just seems like. Really, what I think we’re struggling with is that somebody turned it on. The program was launched, that’s the Big Bang, and we can’t handle it.

John: Actually, I want to dig into what you’re thinking. Do you believe that the simulation began with a Big Bang, or do you believe that it started at some other point and a narrative was installed, and it basically, retroactively, filled in the space behind it as an explanation?

Craig: Either one could be true. It’s either that the simulation was running along, and then someone went, restart it, but start it with this, and let’s– I suspect that it’s really more that the actual initiation of the simulation appears to us through our primitive physics as a large explosion in which everything, information, was contained. The Big Bang Theory says there was one little tiny, infinitely small dot that contained everything that we see. The gazillions of things. I don’t know how much mass we suspect the universe has. All of it was there in that tiny little dot, and then it exploded outwards. I think maybe it just turned on. Seems like it turned on.

John: Yes. Expansion versus creation.

Craig: I think it was the code began to run.

John: I should say before I forget to say that if this is an intriguing conversation for anybody or this resonates, my movie, The Nines, is actually about this.

Craig: Yes. Go see the movie.

John: Go see the movie. I want to dig in a little bit more here, because when I think about– I would consider myself an atheist, or at least I don’t believe that there’s an active God who cares. I think, like you, I’m fine with the idea that there is a creator, the first cause, the first mover of things. I remember taking a philosophy or religion class in college, and we went all through ontology and teleology and all the proofs for the existence of God. What I was so frustrated by is, even if philosophically I’m willing to say, “Okay, sure,” it doesn’t get me to the Christian Abrahamic God at all. There’s no tie-in there that makes any sense to me.

Again, the idea that someone flipped a switch, sure, but that doesn’t actually get me to Jesus died for my sins.

Craig: Correct. Nor would it ever. The history of philosophy is riddled with otherwise brilliant people bending themselves into absurd pretzels. Descartes in particular. What the hell? Come on. “I think, therefore I am.” What was underpinning “I think, therefore I am” was I think, therefore I am. If there is an I, that means that God must have made me.

John: Yes. The I is important.

Craig: It’s so tautological.

John: I think, therefore I am. Yes to all that conversation. My question, though, is about these philosophers who were tying themselves in knots to then say, “Oh, but this proves the divinity of this and the thing.” Was it because they actually believed it, or because they needed to contort their statements to fit the culture in which they were living, for their own safety? I was just reading through Seneca’s tragedies, and Seneca the Younger, I didn’t realize, was actually Nero’s tutor. He’s writing these brilliant examinations of power and government, but he writes about the ancient Greeks.

They weren’t that ancient at the time, but he was writing about the Greeks so that he had plausible deniability. He’s not actually writing about what he’s seeing around him.

Craig: I think once we get into, let’s say, out of the Middle Ages, and even from some of the people in the Middle Ages, it is a question of how demonstrative and vigorous they are in their pursuit of this proof of God. Some philosophers just really– Kant really believes in God. Clearly, he’s not trying to get at anything.

John: The question is– and again, I could read the books, but I haven’t read the books. Do we say that Kant believes in God in his heart and therefore, that’s informing how he’s putting his thoughts together, or does he intellectually deeply believe in this Christian God that he’s writing about?

Craig: It’s intertwined. I think what happens is there are some things that you just need to believe. You need to believe them. Kant is so profoundly smart and boring. He’s one of the most boring writers ever, but incredibly smart. It’s clear that there is a presupposed notion, which is ironic, because that’s this whole category of knowledge that he invents. This idea that there are some things that are provably true, that existed before we proved them, nonsense. However, he needed that to be there because it also explained part of how his own mind worked.

I think that some people grow up in a way where they just– they have to deal with the fact that this must be true. Proving God’s existence seems like utter folly to me. The whole point is you can’t. Isn’t that the point of faith? I’m like you, it doesn’t bother me. I’m so atheistic that I don’t even get bothered by religious people. I’m like, “Sure. For sure.”

John: Sure.

Craig: I’m fine, I’m over here.

John: This notion that there is a creator, and that creator is therefore watching us or is somehow involved, always felt like a giant leap to me. Because we’ve all seen systems that just keep running forever. Someone starts them and then walks away, and they keep going, and they might spawn other things. Stuff is just happening in the background, and it doesn’t necessarily mean that there’s, again, a plan, a moral directive for how these things are supposed to be working. That creator might have set the initial conditions that create the fundamental laws of physics and how the universe functions. Maybe there are moral laws underneath it all, but there’s a lack of evidence that they’re enforced.

Craig: Lack of evidence. We also don’t understand how time functions for– let’s call this person the mover or observer. They’re running a cycle of a thing for some reason, or a thing of a thing of a thing is running a cycle of a thing. Maybe even this is some AI trying to learn something, who knows? Our billions of years of existence and our personal tens of years of existence could be gone in a nanosecond.

John: We’re just a training model. We’re being used to train some other model.

Craig: We might be. What I find interesting is how this has changed as the years have gone on from the beginning of history, meaning recorded history. Early on, there was a generational narcissism; people were just starting to observe themselves. Therefore the idea of a God that was watching us all, evaluating, judging us one by one, and then assigning us to a fate made some sense. Yes, Osiris and Anubis are going to be there and weigh your heart against a feather and blah blah blah. Okay, but it’s been thousands of years. The world is ridiculously complicated.

The idea that there is a God watching all of this down to every individual person, to me paints the picture of an enormous dullard. Somebody who’s so dull they’re incapable of being bored. Because I can’t imagine anything more boring than watching every single person, every single second of every single day forever to sort them into bins, for what? That sounds like a dullard.

John: Yes. It’s actually worth their time to be evaluating, “How did this one do?”

Craig: The most powerful being conceivable is just down to sorting.

John: Unless it’s like reinforcement learning, basically. It’s like, “I’m going to set up all these different things and see which one of these models learns to walk the best,” or do something else. Maybe that’s what it is.

Craig: We’re back to simulation.

John: We’re back to simulation.

Craig: The idea of like this isn’t a simulation, this is somehow metaphysically real, and there is somebody watching. I’m watching. I’m listening to you. I hear everything, see everything. What a terrible way to spend your day if you could do anything.

John: What a great question from Tim.

Craig: Thank you, Tim.

John: Tim, thanks for your great question.

Craig: I called God a dullard.

Links:

Email us at ask@johnaugust.com

You can download the episode here.