One of the greatest hits, as far as Christmas gift-giving went, was a gizmo called an Echo Dot that I gave my nephew. It was a hit despite the obligatory annoying learning curve that comes when any Southerner tries to talk to an artificial intelligence device ("Alexer, G-darnit, stop giving me the temperature in Celsius. I want it in American!").
I felt my nephew's pain. I had arrived in Chapel Hill weary from a verbal battle with Siri, who, based on the convoluted traffic pattern she recommended, is a huge Duke fan.
Artificial intelligence is a big buzzword for 2017. Which I just hate because I’m barely hanging onto my natural intelligence, let alone buddying up to an unseen smarty pants who can order a taxi or a pizza or a pizza to eat in a taxi if I just tell it to.
We all know about Alexa and Siri, but it's the cutting-edge AI that gives me pause. I'm talking, of course, about AI being used as a trusted advice-giver.
I heard about this on public smart-person radio the other day so it must be true.
Thanks to the wonders of AI, it’s possible to ask a robot to help you figure out the best way to deal with a difficult lover, friend, co-worker, neighbor, etc.
The kind of stuff you'd normally hash out over a glass or four of wine with your besties is now handled expertly by an AI program. I have to admit that while I prefer human interaction when griping about problem people, AI counseling is more likely to stay focused on the problem at hand and not veer off into an hour-long discussion of "The Crown" on Netflix.
But, as many have said, AI lacks a sense of whimsy. And by whimsy, I guess they mean AI is too high road to crank call your cheatin' boyfriend to say: “You got an STD, one of the really gross ones,” then stifle a belly laugh and hang up.
A new advice device, Oshieru, is especially popular. Just type in info about your latest lover’s quarrel and it will recommend a sensible solution such as “Why not say you’re sorry? This will create an atmosphere easier for him to explain his true feelings. Nothing will change if you both remain disengaged.”
Apologize first? What kind of dumb advice is that? A true friend would respond something on the order of: “I see that there are several convenience stores in your area that sell both gas and sugar. Perhaps you could pour sugar into his gas tank. Tee hee hee.”
Oshieru, like many AI programs, is far from perfect because, if you give it a few too many keywords, it basically shuts down and pouts ("I can't answer that right now."). Alexa did the same for us after we all barked questions at her and she prissily shut down with an unspoken "I can't even."
Something tells me Ask Amy and Dear Sugar are safe. For now.