When customers interact with Meya, they see the Meya Orb. It's the customer-facing component of our platform, and it integrates with dozens of services through the Meya Grid. But you're not limited to text interactions. Meya v2 also supports voice interactions through integrations with Amazon Alexa and Google Assistant.
Interacting with technology through voice and virtual assistants is as common today as typing on a keyboard. But just two decades ago, having a conversation with a virtual assistant was something straight out of a science fiction book or movie. Stanley Kubrick's 1968 classic 2001: A Space Odyssey showed us a vision of the future where astronauts and an artificial intelligence worked together in deep space. 2001's HAL 9000 was visible throughout the film's spaceship in the form of a fisheye camera with a red eye. While there are some visual similarities to the Meya Orb, we can assure you that the Orb will open the pod bay doors when you ask it to.
In the 1980s, KITT, the talking car from NBC's Knight Rider, gave us a conversational AI with sass. The show demonstrated how interactions could go deeper than asking for turn-by-turn directions: KITT understood the questions it was asked and provided nuanced voice prompts to help drive the conversation. (Turbo pursuit mode was also very cool.)
One of our favorite voice scenes in film comes from 1986's Star Trek IV: The Voyage Home. Chief engineer Montgomery Scott asks to use a computer to show the plans for the not-yet-invented transparent aluminum. He sits down at a 1980s Macintosh and delivers the iconic line "Hellooo Computer." Scotty is then told to use the mouse, and, thinking it's a microphone, he picks it up and starts talking.
The ways we interact with voice assistants have evolved greatly since then. Today, our interactions are conversational, interactive, and at times even humorous. We use voice with an ever-growing number of virtual assistants and AIs: Siri, Google Home, Amazon Alexa, and more.
When we're thinking beyond the bot, we look at how businesses and customers interact with each other. As a consumer, when you have a problem with a device, a bill, or a flight, what do you typically do? If you're like most people, you go to the company's website to find a customer service phone number. Over 65% of consumers call a support number to try to resolve their issue, with the remainder handled over email and live or in-app chat.
That's a lot of voice interactions, and it's what's driving us to revolutionize call centers across all industries. Today, our customers are using the Meya platform to create virtual agents that augment or replace live agents in their contact centers.
When we're talking voice, we're talking about two related topics: telephony and voice assistants. You'd be hard-pressed to find someone who's a fan of IVR (interactive voice response) or other phone tree systems. Sure, they're useful, but the user experience is something most people dislike.
The Meya platform lets you create virtual agents like no other platform. Our vision is that the virtual agent is available in whatever space is most convenient for the customer. That could be in the browser, as an engaging chat experience on a streaming service's site. It could happen within a mobile app using in-app chat. It might be in your car as you're commuting and need to make changes to your insurance.
It's a big vision, but we've built the foundation for it today with Meya. Your developers can now write bot interactions once and deploy them anywhere they're needed. These aren't just text interactions: you can add voice interactions to your existing text-based flows using the same code.
There are, of course, some challenges to tackle. For one, the user experience on mobile and web for text-based bots is generally the same, while with voice there's no visual interface at all. For our team, this means building a responsive voice experience: true conversational AI. Offering voice interactions means adapting the chat flow to do things like describing an image rather than sending it via chat.
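To make that concrete, here's a minimal sketch of what that adaptation could look like in BFML, using the same branching pattern you'll see in the sample further down. The image step, URL, and wording are placeholders we've invented for illustration, not a flow from our docs.

steps:
  - if: (@ thread.voice )
    then: next
    else:
      jump: image

  # On voice devices, describe what the customer would otherwise see
  - say: The power light should be solid green and the internet light should blink blue.
  - jump: done

  # On text channels, send the picture itself
  # (the image step and URL are illustrative placeholders)
  - (image)
  - image: https://example.com/router-lights.png

  - (done)
  - say: Let me know if your lights look different.

The same flow serves both channels; only the steps behind the branch differ.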
Building voice assistant interactions also exposes the lack of visual cues and quick replies. When developers build interaction flows in the Meya Grid, they can add voice prompts with options to help the customer navigate the voice interaction. We've made it easy to add that responsiveness: you don't have to build a whole new bot. You simply add code that gets used when a customer is on Alexa or Google Assistant.
Localization is also an important part of the voice experience. In the code sample below, we use our Dialogflow integration to add intent matching and translation directly into the chat flow.
steps:
  - (welcome)
  - say: Welcome to the voice demo.
  - if: (@ thread.voice )
    then: next
    else:
      jump: question
  - say: I can see that you're on a voice enabled device.

  - (question)
  - ask: Would you like a free t-shirt?
    expect: dialogflow
    integration: integration.google.dialogflow.dialogflow
    error_message: >
      Sorry, I didn't understand that. Would you like a free t-shirt?
    intent:
      - input_yes
      - input_no
    quick_replies:
      - Yes
      - No
These integrations are the beginning of our vision to enable developers to build voice assistant support into telephony. Want to know more? Click on the Orb to the right and we'd love to give you a live demo of the Google Assistant and Amazon Alexa integrations with Meya.