A chatbot is loosely defined as a computer program that conducts a conversation via auditory or textual methods. The term is derived from “ChatterBot,” coined by Michael Mauldin in 1994. There are several derivatives – verbot, smartbot, talkbot, IM bot, interactive agent, conversational interface, AI conversational entity, and so on.
Confused? No sweat; this variety hints at the immense possibilities of “chatbots” – I will stick with this particular term. Interestingly, the idea was conceived in 1950 by the brilliant British mathematician Alan Turing. While Turing theorized about text-based conversational interactions, the technology has evolved significantly over the years.
1966 ELIZA; 1972 PARRY; 1988 Jabberwacky; 1992 Dr. Sbaitso; 1995 ALICE; 2001 SmarterChild; 2006 IBM Watson; 2010 Siri; 2012 Google Now; 2013 WeChat; 2015 Cortana and Alexa; 2016 Messenger bots and Microsoft Tay; and so on.
So, what has changed? Well, in the last three to four years there has been an exponential proliferation of AI chatbots, an underlying technology component for delivering higher business value with human-like intelligence (the Turing criterion).
How do these chatbots work? Typically, there is a user interface that allows the user to type (or speak) questions or phrases and expect a response. The input is processed by a natural language processing (NLP) engine and converted into structured code. That code is then run on the target system or application to get the necessary result, and the reverse loop is completed similarly. I know my simplistic imagery of chatbots is at best inadequate; however, it should give some idea while instigating thoughts and ideas around the possibilities.
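The loop described above can be sketched in a few lines of Python. This is a deliberately minimal illustration, not any vendor’s actual engine: the intent patterns and the fake “billing backend” below are my own illustrative assumptions standing in for a real NLP engine and target system.

```python
# Minimal sketch of the chatbot loop: user text -> "NLP" step ->
# structured command -> target system -> response back to the user.
import re

# Patterns the toy "NLP" step recognizes (illustrative assumption)
INTENTS = {
    r"\b(balance|outstanding|owe)\b": "GET_BALANCE",
    r"\bweather\b": "GET_WEATHER",
}

# Stand-in for the target system or application (illustrative assumption)
FAKE_BACKEND = {
    "GET_BALANCE": "You owe $120 and your bill is outstanding for 14 days.",
    "GET_WEATHER": "It is sunny and 72F.",
}

def parse(utterance: str) -> str:
    """'NLP' step: convert free text into a structured command code."""
    for pattern, command in INTENTS.items():
        if re.search(pattern, utterance.lower()):
            return command
    return "UNKNOWN"

def respond(utterance: str) -> str:
    """Run the command on the target system; complete the reverse loop."""
    command = parse(utterance)
    return FAKE_BACKEND.get(command, "Sorry, I did not understand that.")

print(respond("How much do I owe on my bill?"))
```

A production bot replaces the regex table with a trained language model and the dictionary with real system integrations, but the shape of the loop is the same.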
Let’s jump a few leaps forward. “Alexa/Siri – how is the weather?” The user interface has changed from text to voice. That brings the complexity of audio signal processing: where the sound is coming from (spatial determination), how much of it is white noise (frequency and dB distortion, etc.). The signal is then passed to an ASR (automatic speech recognition) engine that can tell human speech apart from other noises. If you prank with a recorded voice – “Alexa, wake up” – most likely Alexa will continue to sleep, because the ASR tells Alexa that it was not live human speech. The text is then taken over by an NLU (natural language understanding) engine, a subset of the NLP domain, which uses AI techniques to “understand” (and not merely “translate”) speech. You say, “I am tired, can you play Comfortably Numb.” The engine will not only respond based on its learning; it will understand your question. Next time you say, “I am done for the day, man!” it can come up with “Shall I play some Vivaldi? It may make you feel better.” Conversations need not be restricted to text or audio; they can use sign language too.
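The “understand, don’t just translate” idea can be illustrated with a toy NLU step that maps different phrasings to the same intent. The intent names and keyword sets below are illustrative assumptions; real NLU engines use statistical models rather than keyword overlap, but the goal – capturing meaning across varied wording – is the same.

```python
# Toy NLU: after ASR yields text, map *meaning* (not exact words) to an
# intent by scoring keyword overlap. Intents here are made up for
# illustration; they are not any assistant's actual API.
INTENT_KEYWORDS = {
    "PLAY_RELAXING_MUSIC": {"tired", "done", "exhausted", "long", "day"},
    "GET_WEATHER": {"weather", "rain", "sunny", "forecast"},
}

def understand(text: str) -> str:
    """Score each intent by keyword overlap; return the best match."""
    words = set(text.lower().replace(",", " ").replace("!", " ").split())
    scores = {
        intent: len(words & keywords)
        for intent, keywords in INTENT_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "UNKNOWN"

# Two different phrasings, one underlying meaning:
print(understand("I am tired, can you play Comfortably Numb"))
print(understand("I am done for the day, man!"))
```

Both utterances resolve to the same intent even though they share no wording with each other, which is the essential difference between understanding and word-for-word translation.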
Dr. Prem Natarajan of Amazon Alexa believes every day is the first day for innovation in this field until we reach the level of AGI (artificial general intelligence) or superintelligence. New frontiers are opening every day, from phenomenon modelling (2014) to intelligent decisions, reasoning, and autonomy on the AGI trajectory. The question is, “When?”
Martin Luther King’s phrase “the fierce urgency of now” may encapsulate what is going on in the field of chatbots. The other day, the IT leader of a network storage company mentioned that he is chartered to automate more than 200 manual intervention points per year through AI chatbots. Welcome to the world of chatbots. According to Gartner, by 2020, 80% of businesses will use some form of chatbot. Hold your breath: an average person will have more conversations with bots than with their spouse – eeks!
The banking app on your phone has a little bubble that encourages you to chat. You go to an online music store, and a pop-up appears, saying, “Can I help you?” Some of these assistants are quite mature; some are at a nascent stage. The banking industry is the front-runner in this area, while insurance and other public services are not far behind. Assistants are the most visible use case in the field of customer service, and we are observing an extension of their capabilities (or skills) there too. For example, paying a utility bill: a customer goes to the utility provider’s portal and asks for his outstanding payment. The answer comes: “You owe $< n > and your bill is outstanding for < x > days.” Extend further to “Can I help you pay your bill?” Right there is the use case for improving days sales outstanding (DSO). Reservations, purchasing, purchase recommendations, FAQs, etc. are the usual suspects.
While chatbots are generally known to automate tasks, they are now designed to deliver “experience.” Chatbots are transforming employee engagement - onboarding, training, process orientation, benefits/enrollment, etc. This is an area where task and experience delivery are getting streamlined by delegating activities to chatbots.
Employee: “Mila, I am sick, cannot come to work”
Mila: “Ok, take care, get well soon”.
Mila is a chatbot at Overstock.com, helping its call-center staff with schedules, time off, and “sick” call-ins. Let us extend this use case a bit. A field rep reports, “Emily, I am running late by 45 minutes.”
Emily: “I noticed there is a traffic situation on your route. No worries, I am sending you a new route; take that instead.” In the process, Emily proactively checked the GPS data, informed the manager, triggered a scheduling/routing application, updated the timesheets, and so on. Extensions can be made where access to devices may be difficult, e.g., providing interactive step-by-step instructions to a support tech fixing radio equipment on a 70-foot telecom tower.
One would instinctively think chatbots are reactive: they can intelligently answer my questions and become smarter every day – now, what next? Proactive chatbots are a reality today. The assistant app can alert you: “Looks like we may miss the shipment. We have some stock in another warehouse; I can ship it from there. It will cost a bit more, but this is an important customer – may I?”
What would the future of chatbots look like? I would argue that technology has increased the user surface area and will continue to do so. Essentially, users can interact with and experience various shapes and forms of intelligent machines (I am including chatbots too). The possibilities are limitless – therefore, the question is irrelevant. There are chatbots that do not require you to “chat.” The Amazon Echo Look, for instance, looks at you (visual input), matches clothes from the store, and additionally simulates how you may look in each of your choices, helping you in your buying decision. Another emerging trend is cross-platform interaction. Your interaction and experience are not limited to one “bot”; they traverse multiple “units.” There are open SDKs/APIs to create nuggets (Prem calls them “skills”), which can be consumed by other applications/systems over the air (cloud). The approach here is to abstract “integration” to deliver experience through cross-platform traversal.
Chatbots are integrating with robots. You may “chat” with a CNC robot instead of pressing some colorful switches – reminds me of Jim Cramer’s Mad Money. Autonomous mobility (let’s call it “driving” for now) will extensively use chatbots as the front end: “I want to pick up my kids from school and then take them to a park that is not too crowded.” Healthcare opens up innumerable possibilities and can address the fundamental issue of rising costs. One can talk to a specially designed machine (a collection of sensors, chatbots, etc.) that interacts with wearable devices (e.g., the Apple Watch’s single-electrode ECG monitor), diagnoses the problem, and may prescribe medication or call the EMS.
There are concerns relating to privacy and security. There are questions about whether these technologies will erode consciousness and conscience. We will have to take a human-centric design approach before jumping to fancy tools and engaging geeks. Remember the command line? That is where we started our first textual interactions with an intelligent device. We moved to graphical user interfaces first and then to browsers. Chatbots in different shapes and forms will soon replace all of that, simply because we now have the capability to increase naturalness by using ambient interactions. In 1950, Turing proposed that a truly intelligent machine would be indistinguishable from a human during a text-only conversation. We have gone way past text-only conversation. A few questions are still unanswered.
Can a chatbot become a persona and represent the person and walk around the factory floor? Can you interact with the hologram of your friend? Can chatbots spread love?