Will AI Become Friend or Foe to Animators?

Pixar's "Elemental" (Pixar)

As the animation community deals with industry contraction, one force inspires more fear and uncertainty than any other: artificial intelligence. The Annecy Festival will screen four works made with AI after receiving dozens of submissions that used the technology.

Whether the concern is that AI will make some animation jobs obsolete or replace humans altogether, there’s discussion about what the technology will mean not only for artistry but also for workflow. Some see AI as a tool to be mastered and carefully applied as issues surrounding copyright, creation and overall use are sorted out. For all the questions AI raises, they view it as just another technological evolution: just as one generation of artists had to contend with Photoshop and what it meant for digital imaging, this wave of artists will have to learn how to use AI.

“At the end of the day, computers are expensive pencils,” says Cathal Gaffney, managing director of Dublin-based Brown Bag Films (“Dylan’s Playtime Adventures”). “They are a creative tool to assist artists to realize their vision. It enables the artists, producers and directors to create things. The people that are going to be out of work because of AI are the ones who don’t know how to use AI to help them do their jobs better. AI is a productivity tool. It speeds up certain tasks. The industry is struggling right now for all sorts of reasons. AI might be the perfect storm that people are worried about, but I think colleges have a duty to start teaching their students the benefits of AI in terms of speeding up some tasks.”

In March, at Ireland’s cartoon festival, Animation Dingle, Gaffney participated in a panel dubbed the “Ascendance of AI in the Creative Realm.” It was presented in the hope of giving students who aspire to work in animation an opportunity to ask questions of industry leaders about how AI will affect their prospects.

“Everything is programmed for students at the festival,” says Maurice Galway, Animation Dingle’s director and co-founder. “There are a lot of young people I’ve heard from, and they don’t like the idea of [AI], but it’s not that it’s going to change. We’re not going back, so people will likely have to learn to use it as part of their work.”

AI tools like Stable Diffusion (used to generate images from text prompts) and Midjourney (often used for rapid prototyping) have upended expectations for what’s possible through AI, but they’re still not as far along in development as some might want or others might fear. The market is crowded with other image generators such as DALL-E, ImageFX and NightCafe, to name a few, and more AI-based tools like these are constantly being released. But human animators and creatives are still the primary sources of ideas and will be for some time, according to Delphine Doreau, program director for animation at Pulse College in Dublin. Animators will likely need to know how to use the tools to keep their skill sets current, though.

“AI uses data to pop things out,” says Doreau. “There is no futuristic vision. There’s nothing investigative, prospective or creative. So, it’s limited. Artists won’t be replaced because it’s not possible as long as we want to do innovative things, which is less evident in today’s market. So, the risks are either a stagnation like in medieval times when innovation was dangerous, or we will use AI as a tool so we can work a little bit faster.”

AI also brings up thorny issues surrounding copyright and intellectual property. A production’s story, characters and overall animation often spend years in development, from preproduction through the iterative production process in which scenes are repeatedly screened, workshopped and rewritten.

If the original creators feed their materials into AI to teach it to make things like backgrounds, sparing animators from repeating certain tasks during production, that’s no different from the makers of the old Tom and Jerry cartoons reusing a background as the mouse tries to escape the cat. However, when a company or individual outside of those creators aims to use materials they don’t own in conjunction with AI, that becomes something very different.

Drew Mullin, executive in charge of production for CBC Kids in Canada, believes broadcasters are concerned about AI scraping data from non-licensed sources, since that can raise copyright issues. There’s an opportunity for producers and artists to use the tools to cut down on production time, but carefully controlling what materials are used to train AI for certain tasks will continue to be challenging. For now, broadcasters are working out how they will respond.

“These types of machine learning tools, you know, have been around for a while, and [much] of the software that the teams we work with have had these things incorporated into them,” says Mullin. “It’s something that I marvel at, but first and foremost, I think we have to protect artists and protect the filmmakers and producers that we work with. Being part of CBC Kids, we’re a public broadcaster. We’re just approaching the whole thing very cautiously, and we’re looking at everything on a case-by-case basis because this technology is not going anywhere. How [AI] is incorporated and how it’s used is very important. I think every broadcaster in the world right now is looking at this and looking at implementing policies.”

Gaffney believes production companies have an obligation to discuss how they will and will not use AI in their work. He plans to post his company’s policy on its website in the coming weeks. This will affect the kinds of AI skills he wants animators at Brown Bag to have and how they will work with the technology. The policy will change over time, depending on how the tools evolve and what uses artists find for them in their work.

“I think every animation studio needs to have an [AI] policy just like they have an environmental policy,” says Gaffney. “It’s important for both clients and staff to understand the company’s approach. Before this, there was incremental change, but this is a massive disruption.”

Gaffney notes that something like Midjourney is “generative AI that has been trained on everybody else’s copyright,” from Pixar to Disney to Brown Bag Films.

“So, anybody around the world can say, ‘Give me a character design in the style of Doc McStuffins or any Pixar character.’ It’s been trained on a lot of people’s copyrights. So, you’re getting a computer to, basically, mimic other people’s copyright now.”

That’s an ethical issue. “If you’re in the business of creating copyright, you also have to be in the business of respecting other people’s copyright,” says Gaffney. “And I think a lot of the generative AI that’s trained on older [artists’ and studios’] copyright is no different than a studio using pirated software. Automation isn’t going away anytime soon. How we use it is going to change, but we shouldn’t cannibalize ourselves and disrespect other people’s copyright.

“There’s certain kind of boundaries that you just don’t touch.”

From Variety US
