Voice actors reportedly urged to sign voices away to AI companies
I don't know that voice.
Voice actors are becoming increasingly worried about the growing prevalence of AI in the voice acting industry. Speaking to Motherboard, several actors revealed they've been asked to sign away their voice rights so those voices can be used in AI software for other clients.
AI voices have gone viral on social media, and they're becoming a larger point of concern as the technology grows in popularity. Non-voice actors have gotten a kick out of making videos in which AI voices say things the original actor never did, and others are already calling the technology the real future of voice acting.
Last year, AI became more prevalent in game-adjacent industries such as art thanks to platforms like ArtStation and Kickstarter. The technology has been controversial to say the least, and was notably used in 2022's sci-fi shooter High on Life.
Sungwon Cho (Borderlands 3) referred to the practice as "disrespectful to the craft," and added that synthetic voices would take "the soul and spontaneity out of a real-life performance. [...] Actors should be given the option to not agree to their use."
These clauses, according to National Association of Voice Actors (NAVA) founder Tim Friedlander, are for "non-synthetic voice jobs that give away the rights to use an actor's voice for synthetic voice training or creation without any additional compensation or approval."
In many cases, actors don't even realize AI clauses have been written into their contracts until after they've signed. And those who do notice them, he said, "are being told they cannot be hired without agreeing to these clauses."
To him, companies prioritizing synthetic voices would "damage a large part of the industry," particularly for the lesser-known voice actors who populate games and animation. NAVA published a blog post advising voice actors to read their contracts to ensure AI voice clauses aren't included, and to alert union representatives should a clause exist.
Voice actors aren't here for the advent of synthetic voices
Motherboard's report focuses particularly on ElevenLabs, a software company that claims to "provide the necessary quality for voicing news, newsletters, books and videos." Its co-founder Mati Staniszewski explained that he was working towards a future where AI and voice actors work cooperatively with one another.
Fellow voice actor Fryda Wolff (Mass Effect Andromeda, Apex Legends) pushed back against Staniszewski's statement, calling it "darkly funny. That nonsense jargon gives away the game that ElevenLabs have no idea how voice actors make their living."
She added that it could be possible for developers and animation studios to "get away with squeezing more performances out of me through feeding my voice to AI [...] then never compensating me for use of my ‘likeness’, never mind informing my agency that this was done.”
Last year, it was reported that multiple triple-A developers were already using AI programs rather than human voice actors. Prominent voices like Yuri Lowenthal and Ashly Burch pointed out at the time the ethical murkiness of programs using the voices of real actors, and those speaking to Motherboard shared similar sentiments.
Sarah Elmaleh (Halo Infinite) gave hypotheticals about actors declining to perform specific lines for moral or personal reasons, and said the technology "obviously circumvents that entirely."
Elmaleh said that when it comes to performing, "consent must be ongoing." It's a similar concern to the one Lowenthal raised about AI voices in 2022, and the lack of consent is something that AI communities, for art and voice acting alike, still need to have much stronger conversations about.