Intelligent machines are taking over the world.
And slavish Hollywood, which once gave us a warning with “The Terminator,” has this to say: Thanks, Siri, may I have another?
According to a report in The New York Times, the actors’ union SAG-AFTRA is rightly enraged by a chilling provision in some Netflix contracts demanding performers sign over their voices to the studio to be digitally recreated — potentially for whatever it pleases: movies, TV shows, ads, anything.
The outrageous requirement amounts to the potential AI-ification of acting — digital carbon copies that rob performances of spontaneity, heat and, well, humanity. The shameful death of art is unfolding before our very eyes.
The dystopian documents ask permission for the signee’s unique sound to be artificially replicated “by all technologies and processes now known or hereafter developed throughout the universe and in perpetuity.”
Throughout the universe? Is Netflix’s head of legal the Ghost of L. Ron Hubbard?
Well, that request has me as rankled as when I watched “Battlefield Earth.”
Netflix told the Times that the voice provision is intended for cases in which a voice actor leaves an animated show, allowing the studio to seamlessly transition to a new performer.
But it’s depressingly easy to foresee a future in which actors regularly hand over not only their voices, but also their physical likenesses to greedy employers — to be used for far more than toys and other merch.
How alarming it is to see this creepy practice of human digitization become an accepted procedure, rather than a hasty and controversial workaround to finish a movie when an actor dies, as with Carrie Fisher in “Star Wars Episode IX: The Rise of Skywalker” or Oliver Reed in “Gladiator.”
Even in those desperate circumstances involving sensitive family and estate negotiations — not to mention good taste! — the replicas mixing stock footage with advanced technology give everybody the heebie-jeebies (that is, everybody except William Shatner, who dreams that an intelligent, computerized double of himself will be made ASAP).
It was definitely weird when Andy Warhol’s chatter was freakily recreated by a computer for the recent documentary “The Andy Warhol Diaries.” At least Artificial Andy read aloud the artist’s actual writings.
Worse was when the Anthony Bourdain doc “Roadrunner” did the same with 45 seconds of narration from a fake version of the TV host and writer. It understandably caused an uproar.
But struggling studios don’t care and are giving the finger to art more and more often.
So it won’t be a shocker to see voice-and-body duplicates of dead or aging stars used to extend dusty franchises and pay flesh-and-blood talent less — or not at all.
This summer’s “Indiana Jones and the Dial of Destiny,” from Disney, begins with a de-aged version of 80-year-old star Harrison Ford.
Hollywood’s opportunism also presents a downside for audiences: Hassle-free AI actors that look and sound just like the real thing mean Hollywood can abandon creativity even more than it already has.
Studios will happily churn out terrible sequels until the Four Horsemen gallop into Burbank.
With CGI human beings becoming increasingly realistic, why not have a fake Fisher (the actress died in 2016) play Princess Leia in her 20s again and again?
And what’s to stop NBC from producing 35 new seasons of “Friends” where nobody ages and Jennifer Aniston doesn’t have to be paid a dime? Old “Friends” episodes already sound like ChatGPT prompts.
My nightmares get worse. Soon, Movie Critic ChatGPT will be reviewing “Daddy’s Home 47,” written by Screenwriter ChatGPT and starring AI Mark Wahlberg.
Do I sound like a hysterical Henny Penny? Possibly. But AI — and the speed at which it’s pervading the workforce and popular culture — has eclipsed common-sense reservations about the possible consequences.
One source told me that on a recent Netflix film, 360-degree captures of every actor were made. The stated rationale was to have a backup in case one of them kicked the bucket.
Seems like a bucket of BS to me.