Computers unsettle us when they do things we don’t expect. Take the case of a Reddit user who goes by the screen name “barney13.” When he asked Google Now – the intelligent assistant he accesses through his smartphone – to show him travel pictures he had taken in Nice, France, the digital helper showed him the photos, but also expressed its sympathy for the loss of his father, who had died in the city back in 2010.
What happened? After verifying the interaction with a video he posted on YouTube, barney13 explained that Google Now had learned about his father’s death via an email he received shortly afterwards. The request for photos from Nice sparked an immediate search through his data, connecting the city with the event. In finding the message of condolence “a lovely moment,” barney13 may be an outlier. Many people find such machine-human interactions bizarre.
Not me. I think intelligent assistants have a great future, and I benefit daily from dealing with Google Now. But we have to work out the kinks here. Microsoft lit up the Internet recently by releasing a “chatbot” called Tay. The idea was that users would talk to the bot through Twitter, and it would use artificial intelligence to respond. Unfortunately, Tay went ballistic as online trolls taught it to spew racist rhetoric and otherwise mimic a Nazi gauleiter.
Wave of bots
Microsoft pulled Tay offline, tweaked it, tried again and failed again. But this story is not going to end here, because Tay is only the leading edge of a wave of bots that are headed our way, not just from Microsoft but from the Googles and Facebooks of the world. Microsoft CEO Satya Nadella is quite clear about this, telling the Build conference in San Francisco in late March that the company has created a new Bot Framework to help developers build more chatbots.
The vision runs something like this: The age of the app is about to be supplemented by a new model. Apps are great for specific functions and let you escape the complexity of desktop programs. But apps are best at delivering data rather than manipulating information. What happens when we want to ask our smartphones to do more complex tasks, like booking us on a flight, or getting us restaurant reservations? While an app makes us fill in user fields or sign in to accounts, a bot can pop up inside a messaging program and deliver the needed answers.
The bot, in other words, anticipates your needs. In Asia, messaging apps like WeChat are wildly popular, allowing users to make hotel reservations, hail cabs or buy products using text messaging. Microsoft plans to enable Skype to run chatbots that pick up on your conversation and can offer a variety of services matching the dialog. Managing the coming proliferation of chatbots to make sure they deliver only relevant information is clearly Microsoft’s goal.
Does this remind you of anything? Back in the 1990s, Microsoft created Clippy, a talking paperclip that would pop up in Microsoft Office to ask you questions. Like Tay, the digital “helper” was eventually retired, in Clippy’s case after user complaints multiplied. But both Tay and Clippy, with a pedigree going all the way back to a 1960s chatbot called ELIZA, are harbingers of what the big computer firms are hoping to build: artificial intelligence that finally gets human interactions right.
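For readers curious how ELIZA pulled off its illusion of conversation, here is a minimal sketch of the technique it pioneered: pattern matching against the user’s words, with canned templates that mirror them back. The patterns and replies below are illustrative inventions, not Joseph Weizenbaum’s original script.

```python
import re

# Illustrative ELIZA-style rules: each pair is a regex pattern and a
# reply template. The bot has no understanding, only word-mirroring.
RULES = [
    (r"i need (.+)", "Why do you need {0}?"),
    (r"i am (.+)", "How long have you been {0}?"),
    (r"because (.+)", "Is that the real reason?"),
]

def reflect(fragment):
    """Swap first- and second-person words so echoed text reads naturally."""
    swaps = {"i": "you", "my": "your", "me": "you", "am": "are"}
    return " ".join(swaps.get(word, word) for word in fragment.lower().split())

def respond(utterance):
    """Return the first matching rule's reply, or a stock prompt."""
    for pattern, template in RULES:
        match = re.match(pattern, utterance.lower().strip())
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Tell me more."

print(respond("I need a vacation"))  # -> Why do you need a vacation?
print(respond("I am worried"))       # -> How long have you been worried?
```

A handful of rules like these, scaled up to a few hundred, was enough to convince some 1960s users they were talking to a person – which is precisely why today’s far more capable bots still inherit ELIZA’s central lesson: plausible conversation is not the same as comprehension.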
Microsoft’s Bot Framework points to a future where bots proliferate in widely different applications, spotting from your online actions what you need and supplying the information. Trying to reverse its declining fortunes in the mobile market, Microsoft is trying to create a service model using bots that is agnostic as to platform. In other words, it won’t care whether you’re using an iPhone or an Android device, much less a Windows phone. It will simply offer the bot tools to do what you need while tapping its own cloud services for data delivery.
Only time will tell whether Microsoft will parallel WeChat’s success with bots in China. But let the lesson of Clippy be remembered. People love to be served but they hate to be annoyed.
Paul A. Gilster is the author of several books on technology. Reach him at firstname.lastname@example.org.