Owlchemy Labs CEO Andrew Eiche thinks devs should be bullish about AI
"These tools exist, and they aren't disappearing anytime soon. The choice is to figure how to make them work for you, to figure out where the workloads actually fit best, and to build them in an ethical way."
At a Glance
- Developers are still seeking answers to the question "how can generative AI be used in games?"
- AI's high energy usage and dubious data-sourcing ethics have many developers wary of adopting it.
- But Owlchemy Labs CEO Andrew Eiche thinks developers should be excited about the future use of AI tools.
First things first—Owlchemy Labs currently isn't using generative AI in game development.
In a sit-down chat with Owlchemy Labs' 'chief executive owl' Andrew Eiche at DICE 2024, the virtual reality veteran shared that fact before we dove into a broader conversation about his enthusiasm for AI in the video game industry. "But wait," you might ask. "Wouldn't someone excited about generative AI be using it as we speak?"
He probably would be—if the tools Eiche and his colleagues at the Google-owned studio needed were available. "We don't have the tools at our disposal that would be useful," he said, calling current AI technology "novel," but not ready for primetime.
His need to establish that the studio isn't using AI stems from a broader point: the use of generative AI is, for now, controversial. Developers who reveal they're using it face backlash from both professionals and players, and that backlash is motivated by different factors. Many are angry at how content-generating AI tools draw on internet-scraped datasets to mimic creative work. Others look at the energy use and e-waste associated with AI data centers and wonder how the technology can coexist with the need to fight climate change.
Others think a lot of the content just looks or sounds bad.
And yet nearly half of the developers surveyed by our colleagues at the Game Developers Conference acknowledged they are currently using generative AI, and developers like Broken Sword creator Charles Cecil have made passionate arguments for how it helps them accomplish tasks that would otherwise have been impossible.
Eiche casts himself in the same camp as Cecil: a company leader who understands why there's hesitancy over the technology, but who still thinks developers should embrace what's coming down the pike.
"It's probably gonna be pretty boring"
Wrangling any conversation about AI technology requires both parties to pin down exactly what they're talking about. The spectrum of "AI enthusiast" runs from someone who thinks ChatGPT is a few hops away from so-called artificial general intelligence (it's not) to someone excited about how the technology can be used to auto-translate ASL or identify different types of cancer.
Eiche runs closer to the latter category. He joked that some programmers—particularly 'big application developers'—have thrown themselves into the former category because "it's the first time they've had joy back in their lives."
"They've been working in big things like Kubernetes...it boils down to [them] seeing text in a command window, and with AI they're like 'I'm not seeing text! I have a picture!' without maybe thinking they overcorrected."
He compared what we're currently living through to the early days of VR hype, when boosters pitched the technology as something that could "solve empathy," among other wild claims. "We're in that state right now with AI," he said. "The problem is that...the most exciting thing you can do with AI isn't necessarily of value." He pointed to image-generating tools like Midjourney, which produce pictures that boast high fidelity but are riddled with graphical flaws.
"We've created the sweetest meme machine," he joked, when I raised the specter of the puffy-jacketed pope photo that went viral for a few days.
In his view, what's missing from many generative AI tools is any ability to control for randomness, the true source of those graphical flaws. But that's a tough problem to solve, because the randomness is what's made Midjourney such a useful tool for making uncanny knockoff art as opposed to deliberate visual production. "We can generate a cool picture, but can you generate assets that actually work [in game production]?" he asked.
If AI-generated visual assets don't have any equivalent to Photoshop layers or VFX plates that can be tweaked, developers will essentially be stuck mashing the "randomize" button to get the least-bad version of what they want to make.
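To make the "control for randomness" point concrete: most image-generation pipelines expose exactly one lever for it, a fixed random seed. The sketch below is a generic example using the open-source diffusers library, not anything Owlchemy has said it uses, and the model name, prompt, and filename are illustrative assumptions. It shows that pinning the seed makes a render reproducible, but the result is still a flat bitmap with nothing like the layers or plates Eiche describes.

```python
# Minimal sketch: seeding an image-generation pipeline so the "randomize
# button" at least lands on the same result twice. Model, prompt, and filename
# are illustrative assumptions, not Owlchemy's workflow.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Pinning the generator's seed makes the output reproducible run to run.
generator = torch.Generator(device="cuda").manual_seed(42)
image = pipe(
    "hand-painted wooden crate, orthographic view, flat lighting",
    generator=generator,
    num_inference_steps=30,
).images[0]

# The output is still a single flattened bitmap: there is no layer, mask, or
# plate to tweak afterward, which is the gap Eiche points to.
image.save("crate_seed42.png")
```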
Once those challenges are solved, he echoes Cecil and other enthusiasts in saying the tools will let developers punch above their weight through what is effectively automation. "What we're automating is the thing that used to [require] a bunch of people because the process was the process," he said.
He said he sees this all as "actually very boring," and that it's the same thing that happened in factories over the course of the 20th century. "We look at the process and go 'here's where the human touch matters, can we automate away the parts that are not the human touch parts?'"
Eiche's framing of automation as a technological inevitability in game development is interesting: the architects of factory automation were pursuing increased profits through greater production output, but its implementation became wrapped up in control over labor. Eiche acknowledged that increased automation in factories is part of what killed the idea of the factory town, but still said automating elements of game development is inevitable in the capitalist system we live in.
"Unless I'm advocating for the revolution, we're not going to move back from automation—we're going to move towards more automation because it saves companies money."
We pressed Eiche and asked "why not advocate for the revolution?" "We did make a game making fun of the concept of doing jobs in a post-job world," he responded. "Any statement you want about where we sit can be found in Job Simulator."
Image via Owlchemy Labs.
That did leave us with the question of what "automated" game development could look like or produce. Eiche was skeptical both that AI-wielding devs could make great games entirely by themselves and that more game content would lead to better games. "Human authorship in games, more so than in other fields, is unreplicatable," he said. "We understand the concept of double entendre, and puns."
"You can build a level or mission that winks back at the player, and that level of subtlety is one a computer can't replicate. You can get into an arms race where you make a million missions...and you can make them better and better, but you're not going to teach a computer how to wink back at a human."
Job Simulator is indeed a game where Owlchemy Labs is constantly winking at the player, but the concept he was talking about applies to non-comedy games as well. We chatted about how games like Prince of Persia: The Lost Crown and Titanfall 2 show developers designing around the idea of "I knew you were going to do that," spawning enemies or crafting puzzles that keep up with a hand-crafted progression curve.
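As a toy illustration of what that kind of hand-authored "wink" looks like in code (entirely hypothetical, not taken from Job Simulator or any shipped game), a designer pins a specific payoff to a specific player behavior rather than sampling it from a generator:

```python
# Hypothetical sketch of hand-authored "I knew you were going to do that"
# design: each beat is a designer-written response to a predicted player action.
from dataclasses import dataclass
from typing import Callable

@dataclass
class ScriptedBeat:
    trigger: Callable[[dict], bool]  # predicate over the player's state
    payoff: str                      # the authored response, the "wink"

beats = [
    ScriptedBeat(
        trigger=lambda s: s["backtracked_to_start"],
        payoff="Narrator: 'Back already? I left the door unlocked for you.'",
    ),
    ScriptedBeat(
        trigger=lambda s: s["deaths_in_room"] >= 5,
        payoff="Spawn a checkpoint one jump earlier than the 'fair' one.",
    ),
]

def update(player_state: dict) -> None:
    # In a real engine this would fire events; printing keeps the sketch simple.
    for beat in beats:
        if beat.trigger(player_state):
            print(beat.payoff)

update({"backtracked_to_start": True, "deaths_in_room": 0})
```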
How can developers better talk about AI?
As CEO of Owlchemy Labs, Eiche is in a unique spot to be talking about AI with the developers working for him. He can banter about what the technology can do for the company, but he has to do so from a position of power. However friendly a place the company is, Eiche has the final say on company policy and direction.
"We enjoy talking about it," he said. "There's definitely a bit of defensiveness and concern and rightfully so. A lot of the tools making a big splash are the ones that are attempting to replace [developers], not enhance their work," he said.
His mention of that defensiveness was worth following up on immediately. If Eiche is sincere about his AI excitement, what has he learned about addressing the concerns of his colleagues? He said his goal has been to repeatedly signal that Owlchemy will only take up AI after doing "ethical and legal due diligence" (he's not excited about having a game yanked off a storefront for containing plagiarized content).
It also helps (shocker) to not talk about AI in vaguely apocalyptic terms. Artists are much more receptive to chats about AI when the person on the other side of the table isn't vocally trying to replace them.
He's also found it's helpful to be something of a wet blanket for colleagues most excited about building AI workflows. When they're overly gushing about how they could use an AI tool to create a given kind of art asset, he'll step in and say "in the amount of time you spent getting this to spit out what you want, wouldn't it just have been easier to talk to our artists?"
Eiche's "boring" approach to thinking about what can be made with AI tools is admirable but maybe spotlights who actually needs to hear a message about what generative AI is capable of. Eiche's been a developer so he and other developers have a clearer sense of what kinds of tools they'd need to make games faster and better.
It seems like if Eiche wants developers wary of the technology to come on over, it's the C-suite executives who really need to hear and internalize his message. AI, like VR, can't save the world. And if execs were more willing or able to speak with developers about what it could actually do, backlash to introducing the technology likely wouldn't be as harsh.
Game Developer and GDC are sibling organizations under Informa Tech