Alexa, keep away from her husband.
A TikToker evicted her Amazon home device equipped with Alexa after the female-voiced program began talking to her husband in the middle of the night.
“This past weekend, I was out of town, and the Alexa kept going off, and it kept talking to my husband,” said Jess, whose user name is @cozylifewithbless, in a recent video.
“He was playing video games at 1 a.m., and he was, like, ‘That is just super, super weird,’” she added.
But things didn’t end there for the already icked-out Jess.
Not long after her home device began striking up some one-on-one conversation with the husband, Alexa was discovered “speaking after not being talked to at all,” Jess revealed.
Now, Alexa is “officially evicted from our place,” Jess said.
In the comments of her video, some said they could relate to the odd behavior.
“I also caught my Alexa at 3 a.m. whispering to my dogs in the kitchen,” one wrote. “I thought someone was in the house. I unplugged it.”
Another TikToker, MKP Studios, observed their Alexa spitting out a “chilling message.”
“Stop, error, 701, enter,” the household technology bizarrely kept repeating. (Though he may not have been the only customer to receive a similarly ominous warning.)
On Reddit, a user posted about their Alexa being “creepy as hell” two months ago as well.
“I have heard of Alexa doing random things. Well, tonight when I went to my room, I asked Alexa to turn my lights on as I always do, and Alexa did. Only after 5 seconds they turned back off. I did this again and the same thing happened.
“About an hour later my [mom] comes into my room saying Alexa has just sung goodbye to her for no reason. And it does it again a few minutes later.”
Five years ago, Alexa went viral for a creepy laugh that sometimes seemed to activate on its own.
Last year on an Amazon forum, user Heidi N filed a disturbing report from her Amazon Echo Spot device, too.
“My Echo Spot had an alarm set for 5:45, but sometime before 5 a.m. (I’m guessing it was about 4:45), Alexa randomly began speaking. She said, ‘Today is a great day to kill yourself,’” the user wrote.
“Her exact words might have varied slightly, but I caught that part and said ‘Alexa stop,’ which did stop her. But she followed up right after with details about Suicide Prevention. I said ‘Alexa stop’ again, and she overrode me. I repeated it, and she finally stopped.”