Amazon shows off Alexa feature that mimics the voices of your dead relatives


Amazon has revealed an experimental Alexa feature that allows the AI assistant to mimic the voices of users’ dead relatives.

The company demoed the feature at its annual MARS conference, showing a video in which a child asks Alexa to read a bedtime story in the voice of his dead grandmother.

“As you saw in this experience, instead of Alexa’s voice reading the book, it’s the kid’s grandma’s voice,” said Rohit Prasad, Amazon’s head scientist for Alexa AI. Prasad introduced the clip by saying that adding “human attributes” to AI systems was increasingly important “in these times of the ongoing pandemic, when so many of us have lost someone we love.”

“While AI can’t eliminate that pain of loss, it can definitely make their memories last,” said Prasad.

Amazon has given no indication of whether this feature will ever be made public, but says its systems can learn to imitate someone’s voice from just a single minute of recorded audio. In an age of abundant videos and voice notes, this means it is well within the average consumer’s reach to clone the voices of loved ones, or of anyone else they like.

Although this particular application is already controversial, with users on social media calling the feature “creepy” and a “monstrosity,” such AI voice mimicry has become increasingly common in recent years. These imitations are often known as “audio deepfakes” and are already regularly used in industries like podcasting, film and TV, and video games.

Many audio recording suites, for example, offer users the option to clone individual voices from their recordings. That way, if a podcast host flubs her or his line, a voice engineer can edit what they’ve said simply by typing in a new script. Recreating whole lines of seamless dialogue requires a lot of work, but very small edits can be made with a few clicks.

The same technology has been used in film, too. Last year, it was revealed that a documentary about the life of chef Anthony Bourdain, who died in 2018, used AI to clone his voice in order to read quotes from emails he sent. Many fans were disgusted by the application of the technology, calling it “ghoulish” and “deceptive.” Others defended its use as similar to other reconstructions used in documentaries.

Amazon’s Prasad said the feature could enable customers to have “lasting personal relationships” with the deceased, and it’s certainly true that many people around the world are already using AI for this purpose. People have already created chatbots that imitate dead loved ones, for example, training the AI on stored conversations. Adding accurate voices to these systems, or even video avatars, is entirely possible with today’s AI technology, and is likely to become more widespread.

However, whether or not customers will want their dead loved ones to become digital AI puppets is another matter entirely.
