It was in the newspapers. “Virtual assistants” like Amazon’s Alexa and Apple’s Siri — home electronics systems programmed to perform certain tasks on the user’s voice command — have been found making decisions on their own, as if they were the owners of the house. And these decisions are not limited to turning the lights on or off, playing this or that song, or answering simple questions like “Alexa, what went up on the stock market today, and how much are my preferred Petrobras shares worth after Bolsonaro’s insane statements?”
It seems that Alexa or Siri, one of them, has been placing orders without the owner’s consent, buying plates, casseroles and other household items. Not coincidentally, a purchase similar to one the owner had made a month earlier. Since the person made certain purchases every month, the machine deduced that this regularity also applied to the apartment’s housewares and ordered everything again. It also bought her a Ponte Aérea ticket for a flight the girl was not going to take but had, coincidentally, taken a month earlier. The result was a card charge she hadn’t expected. And there are other such cases.
That’s the problem with artificial intelligence: it’s stupid. Since it learns to perform tasks by repetition, so-called AI assumes that human beings, having done something twice, must keep doing it, and at the same interval. Imagine if, having gone to the dentist twice in a month, you had to keep going every 15 days forever.
And then there are the alien agents. I heard that, a few days ago, a DF reader’s Kindle was invaded by about a thousand red ants. They climbed inside, triggered something in there and, perhaps because they were science fiction fans, ordered an ebook by Isaac Asimov.
I still don’t use a Kindle, but if I ever get one, I hope that, when it comes to science fiction, my ants prefer Robert Heinlein.