This piece was published in the Irish Times on April 18th last.
Do you care which one? Sometimes, a pizza is just a pizza, isn’t it? A taxi is just a taxi, as long as it turns up and gets you there. A newspaper, even this Irish Times, is just a newspaper, right? With our busy lives, we frequently just want to get things done without thinking too much about them.
In 1950, Alan Turing pondered not whether machines can think, but whether machines could ever do what humans (who are thinking entities) do. As I wrote here last month, machines can now think much better than us in some situations, and in ways we do not comprehend. But a more significant question is: do we care when machines do our thinking for us?
Joseph Weizenbaum escaped Nazi Germany as a child and in due course became an academic at Boston’s MIT. In 1966, he wrote ELIZA, a software program inspired by George Bernard Shaw’s Eliza Doolittle. ELIZA showed, for the first time, that machines could apparently do what humans can do.
ELIZA conversed with humans, in written text exchanges, using open-ended questions in a conversational style inspired by the psychologist Carl Rogers. Chatting with ELIZA – a computer – feels like a session with your psychotherapist. For example, ELIZA’s response to the human’s plaintive “I am unhappy” might be “Are you unhappy often?”. If the human then types “Yes”, ELIZA might ask “Can you elaborate on that?”. And so on.
Although readily anthropomorphised, the underlying intelligence in ELIZA – a software program – is startlingly simple. Weizenbaum himself felt that ELIZA was simply a parody of “the responses of a nondirective psychotherapist in an initial psychiatric interview”.
ELIZA follows a script of carefully composed responses prepared by its programmer. For each statement made by a human, it tries to find the best matching reply. The script is essentially a long list of canned responses: if the human says such and such, then the program should respond thus.
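The script-following approach described above can be sketched in a few lines of modern code. This is only an illustrative reconstruction, not Weizenbaum’s original program: the patterns and replies below are invented for the example, but the mechanism – match the human’s statement against a list of patterns and return a canned, fill-in-the-blank reply – is the same.

```python
import random
import re

# A tiny ELIZA-style script: each entry pairs a pattern with canned replies.
# "{0}" in a reply is filled with the text captured by the pattern.
# These patterns are illustrative, not from Weizenbaum's actual script.
SCRIPT = [
    (re.compile(r"i am (.*)", re.IGNORECASE),
     ["Are you {0} often?", "Why do you say you are {0}?"]),
    (re.compile(r"i feel (.*)", re.IGNORECASE),
     ["What makes you feel {0}?"]),
    (re.compile(r"yes", re.IGNORECASE),
     ["Can you elaborate on that?"]),
]

# Fallback replies when nothing in the script matches.
DEFAULT = ["Please go on.", "Tell me more."]

def respond(statement: str) -> str:
    """Return the first matching canned reply for a human statement."""
    statement = statement.strip().rstrip(".!")
    for pattern, replies in SCRIPT:
        match = pattern.fullmatch(statement)
        if match:
            # Pick one of the canned replies and slot in the captured text.
            return random.choice(replies).format(*match.groups())
    return random.choice(DEFAULT)
```

With this sketch, `respond("I am unhappy")` yields something like “Are you unhappy often?”, and a follow-up “Yes” produces “Can you elaborate on that?” – the exchange described above, driven entirely by the scripted lookup.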
Inspired by the interest in ELIZA, many others developed similar ‘chatbots’ with ever more extensive scripts. In 1990, Hugh Loebner, an American inventor and social activist, initiated an annual competition for the most human-like chatbot. Every year since, contestants have tried to build software that can maintain a conversation with human judges without the judges being able to tell whether their companion is man or machine. None has yet won Loebner’s major prize.
Commercial interest in chatbot technology was relatively modest until Apple launched Siri for the iPhone in 2011. Siri both speaks and recognises speech. It integrates knowledge of some web-based services so that, for example, weather reports and news events can be discussed. More significantly, some commercial service providers are integrated with Siri so that, for example, restaurant reservations can be made.
In late 2014, Microsoft China launched Xiaoice (‘shaow-ice’, meaning ‘Microsoft little ice’) for a major Chinese micro-blogging service, and subsequently for other popular mobile messaging apps in China. Xiaoice has an engaging personality. Conversations with it last, on average, significantly longer than with Siri.
Last month, Microsoft trialled an English-language equivalent named Tay (‘Thinking About You’) on Twitter. Like Xiaoice, Tay adapts itself and learns from the human responses it receives. However, unlike Xiaoice, Tay was rapidly coached by pranksters amongst its Twitter audience into politically and socially sensitive areas. Microsoft quickly disabled public access to Tay and is working to improve it.
Also last month, KLM launched Facebook’s first example of ‘conversational commerce’ messaging. Flight reservations, check in and flight updates are presented via the Facebook messenger app. Behind the scenes, customer conversations are automatically triaged for computer generated responses or for assistance from the airline’s customer service representatives. Facebook is actively promoting its concierge technology as a way for consumers to converse about what they want to do or to order, rather than have to navigate menu alternatives in an app or on a web site.
Last summer, Amazon launched Echo, a voice-controlled black cylinder with no keyboard or screen that just sits, for example, on your kitchen counter. You chat to it and it speaks back to give you the news, play your choice of music, read you a book, order groceries and so on. The list of tasks it can undertake is rapidly expanding as Amazon signs up further service providers.
But here is where it gets interesting. When you search for something using Google, Google responds with a list of alternatives. Vendors pay Google to appear at or near the top of this list – but there is a list. However, when you chat to your automated concierge, will you care if the machine does the thinking and selection for you? Do you not just want to order a taxi, or a pizza, or hear the news – when is the brand actually that important to you?
Service providers may have to pay a lot of money to an automated concierge service to ensure that customers asking for a “taxi” are met by one of their drivers; that “pizza” orders do not get sent to a competitor; and that “headlines” are presented from their news service. Whichever company dominates the provision of an automated concierge may well be the next Google.