

Do we still need Conversation Design? Do we need Conversation Designers? That's the question that popped into my head when watching Google's incredible LaMDA demo. Google used a new type of neural network, called a 'Transformer', and fed it with data. The chatbot had a very natural-sounding chat with one of the researchers, pretending to be Pluto (the dwarf planet). And, as a rather interesting twist, a paper airplane!

Gartner just released their latest hype cycle for AI. Right there in the trough of disillusionment are chatbots. Everyone was so excited about the possibilities! What went wrong?

Humans evolved to create and maintain highly complex social groups. To do this, our brains try to understand what other humans are thinking, so we can anticipate and provide for each other's needs. Or anticipate and avoid an attack. But we over-generalize. We fall in love with cats that probably don't love us. We complain that a dog is naughty when it's just responding to the stimuli we introduce into its environment. We over-generalize with pets that have surprisingly similar DNA, biology, and brains to us. Imagine how easy it is, then, to over-generalize with AI that appears to do things like we do. Like chatbots that ask and respond to questions. But chatbots share almost none of the underlying biology (infrastructure), DNA (programming), or experience (data) that makes us who we are.

And that's the issue. We can't use our intuition to understand what AI is capable of. When Google shows off an AI chatbot that pretends to be an airplane and appears to chat convincingly with its human creators (I wrote about it here), it's easy to imagine those capabilities will extend to the chatbot we deploy in our contact center. But Google's chatbot isn't intelligent like us. It isn't flexible, helpful, and resourceful like us. It's just a computer program harnessing billions of examples of conversation to say something plausible. It might be good at chatting about folding paper and flying, but only in certain, very narrow, situations. It knows nothing about your business, nor to whom it should turn for help if it gets stuck, unless you provide very detailed instructions.

And that's the issue: an IVR, voice, or chatbot platform plus data does not equal the capabilities of a human agent.

The latest AI algorithms make this easier. Conversational AI platforms have general models of language built in to help IVRs and chatbots understand what a customer says, so you don't have to teach the bots everything. But those models only work at the surface level. You have to provide lots of examples to fill in the gaps, and very precise instructions on how the bot should behave based on what it thinks a customer said.
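To make that concrete, here is a minimal, platform-agnostic sketch of those two ingredients: example utterances that teach the bot what an intent sounds like, and explicit rules for what to do once an intent is recognized. The intent names, phrases, threshold, and word-overlap scorer are all illustrative assumptions for this post, not any particular vendor's API; real platforms use trained NLU models instead.

```python
from dataclasses import dataclass


@dataclass
class Intent:
    name: str
    training_phrases: list   # the "lots of examples" the platform learns from
    response: str            # the "precise instruction" for what to say back


# Illustrative intents only -- the names and phrases are made up for this sketch.
INTENTS = [
    Intent(
        name="check_order_status",
        training_phrases=[
            "where is my order",
            "has my parcel shipped yet",
            "track my delivery",
        ],
        response="I can help with that. What's your order number?",
    ),
    Intent(
        name="cancel_order",
        training_phrases=[
            "i want to cancel my order",
            "please cancel the order i placed yesterday",
        ],
        response="Okay, let's cancel that order. Can you confirm the order number?",
    ),
]


def classify(utterance):
    """Crude word-overlap scorer standing in for a platform's trained NLU model."""
    words = set(utterance.lower().split())
    best, best_score = None, 0.0
    for intent in INTENTS:
        for phrase in intent.training_phrases:
            phrase_words = set(phrase.lower().split())
            score = len(words & phrase_words) / len(phrase_words)
            if score > best_score:
                best, best_score = intent, score
    return best, best_score


def respond(utterance):
    intent, confidence = classify(utterance)
    # The designer has to spell out what happens in the uncertain cases too.
    if intent is None or confidence < 0.5:
        return "Sorry, I didn't catch that. Are you asking about an existing order?"
    return intent.response


print(respond("has my parcel shipped yet"))  # matches check_order_status
print(respond("tell me about Pluto"))        # no good match, falls back
```

Even in a toy like this, the designer, not the model, decides what counts as a confident match and what the bot says when it isn't sure.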
The platforms try to make all of this easier with nice-looking graphical interfaces to build your bots with. But they provide an illusion of simplicity. Just because you can quickly build a bot doesn't mean you can build a bot that does what your business needs. There are lots of gaps and corner cases that can catch you out. Conversation Designers, and the teams that support them, need to understand what Conversational AI platforms are capable of, and work around those constraints to build a bot that customers will engage with and understand.
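As one illustration of the corner cases a drag-and-drop flow builder quietly leaves to the designer, here is a hedged sketch of a single turn-handling policy: silence, low-confidence matches, and escalation to a human agent. The function name, threshold values, and the shape of the NLU result are hypothetical, not taken from any specific platform.

```python
MAX_RETRIES = 2


def handle_turn(nlu_result, state):
    """Decide the next action for one conversational turn.

    nlu_result: dict with 'intent' (str or None) and 'confidence' (0..1),
                as a generic NLU stage might return.
    state:      mutable dict tracking consecutive failures for this session.
    """
    intent = nlu_result.get("intent")
    confidence = nlu_result.get("confidence", 0.0)

    # Corner case 1: the customer said nothing (e.g. a timeout on a voice channel).
    if intent is None and confidence == 0.0:
        state["failures"] = state.get("failures", 0) + 1
    # Corner case 2: the NLU matched something, but not confidently enough to act on.
    elif confidence < 0.6:
        state["failures"] = state.get("failures", 0) + 1
    else:
        state["failures"] = 0
        return {"action": "fulfil_intent", "intent": intent}

    # Corner case 3: we've failed too often -- stop looping and get help.
    if state["failures"] > MAX_RETRIES:
        return {"action": "transfer_to_agent",
                "say": "Let me get a colleague who can help you with that."}

    return {"action": "reprompt",
            "say": "Sorry, I didn't quite get that. Could you rephrase?"}


# Example: two low-confidence turns followed by a third trigger a handoff.
session = {}
print(handle_turn({"intent": "unknown", "confidence": 0.2}, session))
print(handle_turn({"intent": None, "confidence": 0.0}, session))
print(handle_turn({"intent": "unknown", "confidence": 0.1}, session))
```

None of this appears on the canvas of a visual builder by default; someone has to decide the thresholds, the retry limit, and where the customer goes when the bot gives up.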
Conversation Design isn't just about crafting prompts and defining intents in your favorite IVR, voice assistant, or chatbot platform. Your business doesn't need to have random conversations about Pluto or paper airplanes. You need conversations that help your customers get stuff done. That communicate your brand. And that will engage with and understand them.

Right now we have to teach our Conversational AI platforms almost everything. The role of Conversation Designers will change over time.
