The 5 Categories that Rule Virtual Agents
It just so happens that whenever we design and deploy a new AI-powered virtual agent over voice, the self-service application invariably falls into one of 5 distinct categories. This has been true for more than a hundred clients encompassing hundreds of use cases across 12 industries.
Knowing where and how AI fits in your contact center, for great CX and ROI, begins with understanding how to broadly classify your interactions. If an interaction fits into one of these 5 categories, a live human agent should never have to handle it.
What You Will Learn:
- How to categorize interactions for AI applicability
- The top use cases driving conversational AI adoption
- Real-world examples from 6 leading companies
- How to avoid the biggest mistakes with AI-powered virtual agents
VP of Product,
Chief Marketing Officer,
Director of Customer Success,
The 5 Categories that Rule Voice Virtual Agents
Brian Morin: I appreciate those of you joining and giving us your time. We hope to make this dense with content for you, an interesting segment you can walk away from with some real takeaways as opposed to fluff. Just for some context: here at SmartAction we are one of the leading conversational AI self-service providers in the space. We operate natural language automation for more than 100 brands across many different industries; I don't know of an industry that we're not in. And we leverage the very latest in conversational AI capabilities, which allows us to go really deep, particularly in the complex call types that you wouldn't even think could be automated, but are being automated today.
Now that we are in the position of being one of the most mature conversational AI providers in the market, having built more than 100 self-service applications, one of the really interesting things we did recently was step back and look at all the applications we had been asked to build. What we found really interesting about this dynamic is that we were able to fit nearly all of them into one of five distinct buckets. Usually, the first question we get from a contact center leader, or sometimes from IT or someone in ops, is where to start their automation journey. And invariably it ends up being an interaction that we can fit into one of these five buckets; not always, but the vast majority of the time the answer is yes.
And just very quickly, if you're not familiar with SmartAction and you're wondering why we think we have something to say on the subject: we deliver AI-powered virtual agents as a service, meaning we deliver the full conversational AI technology stack, turnkey. Even though the topic today is voice, we don't just do voice; we actually do omnichannel, but we typically start in voice before scaling digitally. And we don't just sell software licenses or seats. We actually bundle the technology stack with end-to-end CX services, and that means everything: the design, the build, the ongoing operation.
So, at the end of the day, we really are stepping in as a partner rather than just a technology provider, and ultimately that's what makes us responsible for delivering the ROI that's promised and the CX that's promised. From a partnership standpoint, we don't have the luxury to overpromise and then underdeliver. So today, as mentioned, we operate the AI-powered CX for more than 100 brands. We have a front-row seat on the battlefront, doing this on behalf of brands. We've done a lot of journeys from "press one" to "how can I help you today?" And ultimately, this is why we deliver the technology as a service: doing voice well is enormously complex, and you need more than great technology.
You need experts in the field running it. And we'd at least like to think this approach is working for us: we're now the top-rated virtual customer assistant solution on Gartner Peer Insights, as rated by customer reviews. If you have any interest in what customers are actually saying about their experience, that's the place to go for some third-party validation. Let's go ahead and set the table for our conversation today, starting with core capabilities before we get into what these five buckets are. Then I'm going to introduce a couple of our domain experts in their fields, Mark and Marilyn.
The important point to make here on this slide is that when somebody is calling in, from a behavioral standpoint they're starting from a position of tension. That's really important to know, because you have to immediately do something to combat that tension with technology. When a caller is greeted personally in natural language, by a voice that sounds human and demonstrates intelligence by showing, in this case Lindsay, that we know something about her already, doing something predictive in nature, whatever tension she had is quickly reduced, because she now has a glimmer of hope that the system she's interacting with might be smart enough to resolve her issue, and resolve it quickly.
And that makes her far more inclined to interact with the virtual agent than to zero out and wait on hold to speak to a human. I'm not going to go through each item on this slide; that would take too much time. It just gives a snapshot of the core capabilities you can expect at your disposal that cause interactions to fall into one of five distinct buckets. Essentially, think of it this way: it's all the skills needed to mimic live agent behavior, minus the things you can only get from a human, which would be human judgment, complex critical thinking, persuasion, things along those lines.
The biggest thing that contact center decision makers typically don't know about conversational AI, at least over voice, is that it's a great big black box, and what's unknown is all the complexity involved in delivering a great voice experience, because it is really, really easy to deliver a bad one. From a technical standpoint, at least in our case, we are literally tuning and customizing the AI brain for every individual question we ask in a given interaction, tuning for the exact known responses we'll get in return, so we can get the highest accuracy possible, because transcription engines over telephony just don't work very well.
And that means building up domain-specific language models for every interaction and weighting them against grammars of utterances we expect or predict to hear. In fact, if we stumble into an interaction we've never automated before, it's an eight-week development cycle just to train and customize those models. There is no easy button for doing voice well over telephony. We'll be the first to admit it's not a very scalable approach, but it is the only approach that delivers a really good experience over voice. And that's before we even get into the nontechnical elements: the human-centric CX design work of establishing handling rules and swim lanes for virtual agents that we know will deliver an experience as good as or better than a live agent's.
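As a rough illustration of what this per-question tuning can look like, here is a minimal Python sketch of biasing speech recognition toward the answers expected at a given question. The function names, data shapes, and bias weight are hypothetical, not SmartAction's actual pipeline.

```python
# Hypothetical sketch: re-rank ASR n-best hypotheses by boosting ones that
# resemble the utterances predicted for this specific question.
from difflib import SequenceMatcher

def rescore(nbest, expected_utterances, bias=0.3):
    """Pick the hypothesis with the best combined ASR confidence and
    similarity to the question's expected-answer grammar."""
    def similarity(text):
        return max(SequenceMatcher(None, text.lower(), u.lower()).ratio()
                   for u in expected_utterances)
    return max(nbest, key=lambda h: h["confidence"] + bias * similarity(h["text"]))

# A yes/no confirmation question has a tiny domain "grammar":
best = rescore(
    nbest=[{"text": "yam", "confidence": 0.60},
           {"text": "yeah", "confidence": 0.55}],
    expected_utterances=["yes", "yeah", "no", "nope"],
)
# The lower-confidence "yeah" wins because it matches an expected answer.
```

The point of the sketch is the shape of the approach: the recognizer is never asked an open-ended question, so the expected-answer list for each prompt can rescue accuracy that raw telephony transcription cannot deliver.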
And then of course there's all the monitoring and reporting that goes on over time, day in and day out, driving the process of iteration where you continue to improve. Let's go ahead and jump into the very first category of virtual agents. I'm going to handle this category myself before I throw the subsequent categories over to my cohorts. This first category is billing, payment, and collections. Now, you might look at billing and payment and think, yes, there's a portion of that that's very simple. But look at what we're showing here beyond the simplest portion of billing and payment: we're actually talking about going much deeper, into areas like collections that involve a back-and-forth negotiation.
That's something that before today you would never even consider automating, yet it's becoming one of the most popular applications we build, simply because it is too expensive to use humans for collections. But if your only choice is a robocall or a simple text, there is no capability for that back-and-forth negotiation dynamic. We'll play a very quick call to give you an example of what this looks like, so we can level set on what AI is actually doing on the bleeding edge of this field today.
IVA: Hello. This call is from Nation’s Healthcare. Your account requires immediate action. Is this the residence of Jennifer Moore?
IVA: Am I speaking with Jennifer?
IVA: It’s very important that I speak with Jennifer regarding their Nation’s Healthcare account. Is Jennifer available to speak?
IVA: Okay. I’ll wait.
IVA: We have important information about your Nation’s Healthcare account. Due to the sensitive nature of your account information, we will need to verify your identity for security purposes before proceeding. Please provide your date of birth.
IVA: Thank you. Our records indicate that the account with Nation’s Healthcare for Jennifer Moore is past due with a balance of $245.44. You can pay this balance of $245.44 with a credit card. If you make a payment today, I can waive the late fee of $15. Would you like to make a payment for $230.44 today in order to avoid further collection action?
IVA: It’s crucial that you make a payment today to keep your account from further collection activity. I can offer you a payment plan option today. With a payment plan option you can securely provide a credit card on file and a fixed amount will be charged to the card each month until the balance is paid off. Would you like to hear payment plan options?
IVA: We can offer you a payment plan option of two payments of $116 each. Would you like to accept this option?
Customer: No. Too high.
IVA: We can offer you a payment plan option of four payments of $58 each. Would you like to accept this option?
Customer: No. Still too high.
IVA: I can take a partial payment with a credit card. Would you like to make a partial payment?
IVA: How much would you like to pay via credit card today?
IVA: You can pay with American Express, Discover, MasterCard.
Brian Morin: So, what I'm going to do is pause that there. She has that interaction with the virtual agent offering the payment, but I should point out that's not a fake call that was staged; that was a real call from a Brightree customer we support in that area. I think you can see the next level of intelligence that's actually happening in natural language, enabling this back-and-forth capability to follow a decision tree of a conversation flow and accomplish exactly what's needed for both the caller and the company.
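The decision tree in a call like that can be pictured as a descending ladder of offers, where each "no" from the caller advances to the next, smaller commitment. A hypothetical Python sketch of that shape (the step logic and simple division are illustrative; the actual plan amounts in a real deployment come from the client's business rules):

```python
# Hypothetical sketch of a collections negotiation ladder: pay-in-full with
# the fee waived, then payment plans, then a partial payment of the caller's
# choosing. Amounts and escalation order are illustrative only.
def negotiation_offers(balance, late_fee=15.00, installments=(2, 4)):
    discounted = round(balance - late_fee, 2)
    yield ("pay_in_full", discounted)              # late fee waived if paid today
    for n in installments:                         # payment plan options
        yield ("payment_plan", n, round(discounted / n, 2))
    yield ("partial_payment", None)                # caller names the amount

offers = list(negotiation_offers(245.44))
# offers[0] is ("pay_in_full", 230.44), matching the waived-fee amount above
```

Each rejected offer moves the conversation one rung down the ladder, which is exactly the "back-and-forth negotiation dynamic" a robocall or one-way SMS cannot provide.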
So let's jump into category number two that rules voice virtual agents, the area we call status and inquiry. I'm going to tee up Marilyn. Marilyn, this is actually probably one of the simpler call types we handle. There usually isn't a lot of complexity that goes into this, but for organizations that have yet to step into the pond of conversational AI, this is a very simple transactional call that many, many live agents are taking, and it simply makes no sense for them to take those calls if AI has access to the required data.
Marilyn Cassedy: Absolutely. This is an easy, low-hanging-fruit type of call for most organizations to handle in the inbound direction. We know that almost every call center is getting some kind of inquiry: has my package shipped, where is my insurance claim in your process, that kind of thing. What a lot of people don't think about, as a sort of second step in that process, is also automating an outbound call that can prevent that call from ever reaching your contact center in the first place. We can do those in tandem: handling the inbound status calls through automation and deflecting them.
We can also be proactive in that and send either an SMS or a voice call out to folks and just let them know their order has shipped and they should expect it on this day. It’s a really cool way to not just solve the problem of that call, but then also be a little more proactive and think about some additional value you can provide the customer.
Brian Morin: Yeah, so no need for an example on status or inquiry; I think all of us would be bored listening to it. Let's jump into the third category that rules virtual agents: scheduling. We never cease to be amazed at how often scheduling is a high-volume interaction across many different verticals. We do this for a lot of customers; we're showcasing just one example on screen. The problem for Penske, prior to using AI for scheduling, is that they were using what we would just call dumb robocalls or SMS alerts.
Marilyn, those would go out, but if the customer had to change a reservation, or wanted to change a truck or a location, they couldn't do that with those systems. And that would essentially generate even more inbound calls coming back into their contact center. Marilyn, behind the scenes there's a lot of logistics and complexity going on to pull this off.
Marilyn Cassedy: Exactly. Right. This is a great example of needing to have your outbound program working in tandem with your inbound program to make sure you're able to deliver what the customer expects. Penske, as much as they're a moving company, is also a logistics company; they need to make sure they have the right-sized truck in the right place at the right time. And the more of an open dialogue they can have with their customers, the more communication there is through the channels available to them, the more likely they are to hit the mark and have a successful interaction.
So changing out those robocall-type experiences in their outbound for a more proactive, automatable conversation in which a customer can change their truck around has been huge for them. What we were able to do as an add-on there is add some upsell opportunity, like, "Okay, you've got the 26-foot truck. Does it make sense for you to add a handcart or some furniture pads at this point?" That's provided a lot of value for them, because now they're able to automate some of the even better parts of the agent conversation that help drive revenue for the business.
Brian Morin: I did include an example, but I notice we have about 10 minutes left until the bottom of the hour. Instead of taking the time to play it now, we can wait until the Q&A, should somebody want to hear it then. You can think of it as similar to the collections call, where there's back-and-forth negotiation. We understand things like "next Friday at noon," "next week," and specific dates, and the AI allows the caller, in natural language, to have the same back-and-forth negotiation on scheduling as they would with a human.
So let's jump into the fourth category that rules voice virtual agents. This is something we call an intelligent front door. Mark, I'm going to throw this your way to step us through it. Nearly every customer we work with uses an intelligent front door, but usually the first question is, Mark, what is an intelligent front door?
Mark Landry: It's a good question, Brian. What we find in our implementations is that a client will usually have, just like you said, a dumb IVR, which is push-button. They'll ask, "Well, SmartAction, does it make more sense for you to be the first thing the caller hears? Or should you be an offshoot: 'for insurance, press two,' and then it goes to SmartAction?" And we always say we should be the first thing the customer hears, and that's the front door. Think of it as your house: it's the first opening, the first impression of a home. If you walk through somebody's home and the first room is decorated by one person and the next room is decorated completely differently, it's going to be an off-putting, jarring experience.
So giving the customer, the caller, that very consistent experience all the way through, which starts with a human, empathetic approach from the very beginning, from the front door, is what we call the intelligent front door. It creates that branded, warm feeling and, as you said, Brian, the relief of tension the customer feels right from the jump. And then we can go into intelligent, natural language intent capture. If I know your number already, I can look up your ANI and get your information: "Hi, Mark. Welcome back to Acme Co. How can I help you today?" "Well, I need to," blah, blah, blah. "Okay, let's get you set up."
Brian Morin: Right. Let's go ahead, and I'm going to share a reference architecture of what that actually looks like. You can see that on the customer end, whether over chat or voice, they're calling in and the very first interaction lands on our network with our AI, where they are asked, "How can I help you today?" We're capturing that intent in natural language, working through the authentication piece, and then making the transfer, with context, to the appropriate live agent. Or, if it's an intent that can be handled by self-service, it stays with the AI, which either completes the interaction or goes as far down it as needed before transferring to a live agent to pick up where the AI left off. For most customers, this action alone ends up shaving about 60 seconds off a call.
A good example we work with is Electrolux, the second-largest appliance manufacturer in the world, the umbrella brand for Frigidaire and others. They use this intelligent front door, but then they do a data dip to find out whether the customer has ever registered the product they're calling about. If that checkmark isn't there, then instead of the live agent having to take them through product registration, the AI is able to do it. In those instances, it shaves more than two minutes off the call before the live agent handles it.
But if we know they've already registered the product, we can bypass all that altogether and get them to the live agent. Mark, I also know you're working with companies along these lines that are attempting to be more predictive in nature by referencing customer behavioral data when they call.
Mark Landry: That's right, Brian. Take Electrolux, for example: if the caller has called in within the last 24 or 48 hours, depending on how we set the timer, we can skip all the authentication. We don't need to vet this person; it's likely they're calling about the same issue they called about before, just calling for status on something. What we can do is dip into the data and the history of what they were doing 48 hours ago, when they called in, and say, "Oh, hi, Mark, are you calling about your product registration?" Because that's what they were doing the last couple of days. "Oh yeah, I am." "All right, great. Where we left off was," blah, blah, blah. And we just continue along, being predictive, given the data that we know.
One of the other things we do, for power companies and the like: if someone's calling in and we know they have a bill that's due or past due, we can just say, "Are you calling about your bill that's due? Would you like to take care of that?" before they even go on to any other intent. Our clients appreciate us letting the customer know that something is due before we ask, "What else would you like help with today?" because it protects their revenue, and it's also convenient for the customer, who may not have known they had a bill due. Some people don't keep very good track of that stuff. These are some of the things we can do to leverage data and create a flow and an arc for the conversation.
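A minimal sketch of that predictive front-door logic, assuming a hypothetical caller record with last-call history and an open balance (the field names and the 48-hour window are illustrative, not a real CRM schema):

```python
# Hypothetical sketch of a predictive greeting: recent repeat callers skip
# re-authentication and are offered their prior topic; callers with an open
# bill hear about it before any other intent is asked.
from datetime import datetime, timedelta

RECENT = timedelta(hours=48)  # configurable "repeat caller" window

def greeting(caller, now):
    last = caller.get("last_call")
    if last and now - last["time"] <= RECENT:
        # Skip vetting; resume where the previous call left off.
        return f"Hi {caller['name']}, are you calling about your {last['topic']}?"
    if caller.get("balance_due", 0) > 0:
        # Lead with the open bill before asking for any other intent.
        return (f"Hi {caller['name']}, you have a balance of "
                f"${caller['balance_due']:.2f} due. Would you like to take care of that?")
    return f"Hi {caller['name']}, how can I help you today?"

now = datetime(2021, 6, 1, 12, 0)
repeat = {"name": "Mark",
          "last_call": {"time": now - timedelta(hours=20),
                        "topic": "product registration"}}
print(greeting(repeat, now))
```

The ordering of the checks encodes the priority Mark describes: resume a recent conversation first, surface a due bill second, and only then fall back to the open "how can I help you?" prompt.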
Brian Morin: So this is just a really quick call. We'll go ahead and play it. This is from a customer of ours, Designer Shoe Warehouse, to give you an idea of how they're using their intelligent front door.
IVA: Thank you for calling DSW, Designer Shoe Warehouse. You’ve reached Shoephoria. Am I speaking with Holly Sims?
IVA: To verify your account, please tell me the zip code listed on your account.
IVA: Great. I was able to find your account. What would you like to do today?
Customer: Address change. Address change.
IVA: Are you calling about order number 304?
Customer: I think so.
IVA: Do you want to change your billing address or your shipping address?
Customer: Shipping address.
IVA: What’s the five digit zip code of the new address?
IVA: Okay. From the zip code, I already knew your city and state. Please say just the first line of your shipping address, such as 1234 Main Street.
Customer: 7838 Martin Way East, Unit 17.
IVA: Now, I’ll confirm each part of your address. Was the house number 7838?
IVA: Was the street Martin Way?
Customer: Mm-hmm (affirmative). Right.
IVA: Was the apartment or unit number 17?
IVA: Thanks. I have your complete street address now. 78-
Brian Morin: Okay, we'll go ahead and pause that there. Again, that's not a fake call; that's a real call that came through DSW. And you can see how much further advanced conversational AI is now, not just in the natural language back-and-forth, but in the ability to handle and understand accents, I think even better than I can. We're short on time, so we're going to run through this last piece, hit our time posts, and then move over to Q&A. If you haven't gotten your questions in, Sheri will start teeing those up shortly; if you have any questions or comments, go ahead and start getting those in. Now we'll give just a couple of quick examples. Marilyn, I'll probably ask you to keep yours to 60 seconds.
Mark, I'll probably ask you to keep yours to 60 seconds as well. We call this category Triage and Action. That can mean a lot of things; you might think of triage as a healthcare term, but it's a really good description of what a lot of AI does across a number of different interactions. You can see on screen that these are all very different interactions: roadside assistance, reservations, claim approvals. Even though they're very different types of interactions, the behavior is very simple. It is primarily information gathering: asking a number of questions and gathering that information to input into a CRM, pass along to a live agent, or complete the call.
Many of the transactional calls we handle with AI these days fall into this area of triage. Marilyn, maybe give us a 60-second top-down with AAA, and then Mark, I'll have you give a 60-second top-down with Choice, because with AAA it's the virtual agent handling it, while in the case of Choice it's more of a symbiotic relationship between the AI and the human. So, Marilyn.
Marilyn Cassedy: For sure. Yeah. With AAA, the key on the triage side is to understand what's wrong with your vehicle and what type of vehicle you have. Those are sort of the two key factors AAA keeps in mind when thinking about who to dispatch. You would have-
Brian Morin: And where are you located too, right?
Marilyn Cassedy: Exactly. Yeah, and your location as well; that's a great point. The idea being that if you're at home with your Honda Civic and your battery isn't working, they only need to get somebody out to you with a jumpstart. Whereas if you're on the side of the road with your pickup truck and you have a flat tire, that might be a very different scenario. So the triage part for AAA is understanding the specific scenario: where you are, what's going on. And then the action is basically us sending that data back to AAA so a truck can be dispatched to you. I'd keep going and talk a little more about the continuing conversation between the customer, the tow truck driver, and the AI, but this is really a holistic experience that can take place almost entirely through our automated system. I think I only got 60 seconds, though, so I'll pass it on.
Brian Morin: Yeah. Don’t you go step over that line, Marilyn. All right. We can cover more in Q&A for those listening and would like to know a little bit more because there is a lot of complexity behind that. Mark, you work closely with Choice Hotels, the second largest hotel franchiser in the world with nearly every budget brand that rolls up to them. Maybe you can share just a little bit of what AI is doing there in the area of triage and action.
Mark Landry: Sure, Brian. With Choice Hotels brands and other hospitality clients we work with, we've had similar conversations: they really are invested in their live agents closing a sale and doing an upsell for a booking. They want a new reservation call to end with a live agent. They want that live agent to have a human touch with the customer, to upsell a better room, a package, something like that. So what we'll do, to cut down the agent's time, is first vet these callers to make sure they're actually serious about making a reservation. We'll take their destination, their dates (when are you checking in, when are you checking out), how many people are traveling, and then send that on to the agent.
And also in the case of Choice, they have a sort of VIP program with different tiers. If we can identify the person as a VIP member right from the get-go, from the phone number they're calling in on, we'll give them a bit more white-glove treatment as we go through that conversation and then hand them off to a special VIP reservations person.
Brian Morin: Excellent. Thanks, Mark. Just to close out: if by chance you're on the call kicking the tires on conversational AI, you can see an email on screen. If you'd like to engage with next steps, it's usually one of two places. One is requesting a demo: if you'd like to try it yourself, we'll put you in the driver's seat with a demo to see how well the system understands you and how well you can interact with it. The other is understanding the thresholds for change: a free consultation where we can examine the characteristics of your call types and chat types, find out what is perfect for automation, what meets the expected volumes and expected behavior that make perfect candidates for AI, and what the expected ROI would be. We're able to put that together quite easily. Sheri, with that said, I think we covered everything on our side. We'll throw it over to you for some Q&A.
Sheri Greenhaus: Okay, good. A bunch of questions came in. The very first question, and I was actually thinking this probably just as this person was typing it: in the first demo that you did, Brian, what percentage of the calls get past the hello? The person who wrote it in said, "Speaking for myself, I immediately hang up upon hearing the non-person timbre of a voice." This person is my spirit animal, because I do the exact same thing. As soon as I hear it's not a person, I just click; I don't even wait. How do you get through that? How do you get past that?
Brian Morin: Well, I don't have those hard figures in front of me. We obviously have that hang-up experience as well; however, it's not nearly as bad as the hang-up experience with a voice that doesn't sound human. The worst thing you can ever have is a system that picks up and doesn't sound human-like; if it doesn't sound human-like, do you have any confidence that the system has the intelligence to work with you? It's funny how even the sound of a voice plays into that. If we can demonstrate, like what we showed earlier, something that sounds human, something that immediately expresses some sort of intelligence about you, the caller is much more inclined to engage with the system. In the case of Brightree, I don't have those hard figures, but if someone's interested in collections we can certainly share them. They have a really fantastic ROI, and we have a case study on our website that should have all the specifics.
Mark Landry: It also helps if we are allowed to greet the caller by name, if our client allows us to do that. For security purposes, some of them don't, but if we are able to say, "Oh, welcome back, Brian," in a human voice, you'd be amazed: just that little touch personalizes the experience and brings an intelligence to it, the sense that this thing already knows who I am, and a lot of callers will give it a chance. And then the more it proves itself, the more they're going to relax and just follow the conversation.
Sheri Greenhaus: Has anybody tried things like, "Don't hang up, we're calling about your account," or even a little humor, something to pique their interest? Because you know they might hang up; that's what I do as soon as I hear it, and I have missed important calls because of it. Maybe treat it a little differently, a little humor, or say, "This is about your account," something that will immediately grab their attention.
Marilyn Cassedy: There are a couple of pieces to that; I'll start and then pass one of them to you, Mark. The first is that you have to be aware, and I've sort of kept saying this, of whether you're going inbound or outbound. With outbound in particular, that human feel, and then getting straight to the point, as you said, makes a huge difference. With inbound, we've definitely had customers try a sort of "hey, give me a chance" type of messaging, with some success. It really comes down to knowing your customer. We're fortunate that, generally speaking, the teams we work with to build these out are very knowledgeable about their customers, and we bring our knowledge of voice systems; that combination can be really productive. Mark, I think you were probably going to talk more about the scripting, though.
Mark Landry: Yeah, inbound versus outbound; Marilyn makes a great point, because, like Sheri said, if you get a call and it's a robot voice, you're way more likely to hang up than if you're calling into something and it's a robot voice. If it's an outbound call, we really have to script it to identify the calling company right away: "Hi, Mark. This is Electrolux calling about your dryer." As a customer, if I hear information that's very specific to me, I'm less likely to say, "Oh, this is a general sales call," and hang up.
Brian Morin: Yeah. And just to add: one of the big things Marilyn and her team do is, for any change we make or intend to make to try to increase containment or conversions at a certain waterfall, there's always a process of A/B testing in all of those instances, because ultimately it's the data that matters.
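For readers curious what a containment A/B test boils down to, it is essentially a two-proportion comparison. A hedged Python sketch (the variant names, assignment scheme, and sample numbers are made up for illustration):

```python
# Hypothetical sketch of A/B testing a prompt change for containment:
# callers are deterministically split by call ID, and the two containment
# rates are compared with a standard two-proportion z statistic.
import random
from math import sqrt

def assign_variant(call_id, variants=("control", "candidate")):
    # Seeding by call ID keeps the assignment stable if the caller retries.
    return random.Random(call_id).choice(variants)

def containment_z(contained_a, total_a, contained_b, total_b):
    """z statistic for the difference in containment rates (B minus A)."""
    p_a, p_b = contained_a / total_a, contained_b / total_b
    pooled = (contained_a + contained_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return (p_b - p_a) / se

# 40% vs 45% containment over 1,000 calls each gives z above 1.96,
# i.e. significant at the 95% level.
z = containment_z(400, 1000, 450, 1000)
```

The deterministic assignment matters in a contact center: a caller who hangs up and redials should hear the same variant, or repeat calls contaminate both arms of the test.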
Sheri Greenhaus: Okay. Let's see. Have you seen AI used for event registration for an organization that's nationwide? I think, Marilyn, that's kind of like the hotels, right? Reservations?
Marilyn Cassedy: Yeah. I think there are a lot of parallels between reservations and event registration. In both cases, without knowing more about the business case specifically, I'd say that these are sales opportunities. And a big part of what we do is qualify the sales lead. They said they want a new reservation, whether to your event or your hotel, but can they give you dates? Do they know where they're going? These kinds of basic questions make sure that we're only sending really qualified leads to a sales agent, who might be able to upsell them into a bigger room or a bigger package, or multiple attendees coming with them, something like that. That qualification process is a pretty big part of what we do in the reservation area, just to make sure your agents are speaking to the best-qualified callers.
Sheri Greenhaus: Okay.
Mark Landry: And we are able to take credit cards and EFT payments over the phone as well. One thing that I don't know if it's been said yet: during COVID, a lot of live agents are now working from home, and it's a real security risk for them to be taking credit card information over the phone. We've been asked by some clients and new clients to scramble and give them a credit card solution right away, one that a live agent can either transfer to, or we can start off the conversation and contain it fully within the virtual assistant.
Sheri Greenhaus: Okay. The DSW example where they change the address: is there a way, because it could be, going back to Mark's scam calls, that somebody gets ahold of something and wants that product shipped to an address that doesn't really belong to that person. Is there a way to make sure, maybe by asking for other pieces of information, that this is the person who purchased the product and the address change is legitimate?
Mark Landry: Yeah, absolutely. The first thing we'll do in the case of DSW is that we know the phone number that's calling in. If we can identify that phone number, we can dip into DSW's database and know that person's name: "Hi, Mark. Are you calling about your order? I just need to ask you a question or two for security purposes." That can be anything from a street address to just a zip code; we'll have conversations with our clients as to how strict to be. Some of our medical clients need to have HIPAA compliance. That's three unique identifiers: phone number, date of birth, and the last four digits of the Social Security number. We'll do an authentication process before we're doing a transaction. Some use cases don't need authentication, like, "Oh, what are your operating hours?" We can just recite those.
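The tiered authentication Mark describes can be sketched roughly as follows. This is a hypothetical illustration, not SmartAction's actual implementation; the level names, field names, and rules are all assumptions for the sake of the example.

```python
# Hypothetical sketch of tiered caller authentication: low-risk intents
# (e.g. "what are your operating hours?") need no verification, standard
# transactions check something like a zip code, and HIPAA-regulated calls
# require three identifiers. All names and rules here are illustrative.

LEVEL_CHECKS = {
    "none": [],                                   # informational requests
    "standard": ["zip_code"],                     # e.g. an address change
    "hipaa": ["phone", "date_of_birth", "ssn_last4"],
}

def authenticate(level, provided, on_record):
    """Return True only if every identifier the level requires matches."""
    required = LEVEL_CHECKS[level]
    return all(provided.get(field) == on_record.get(field)
               for field in required)

record = {"phone": "5550100", "date_of_birth": "1980-01-01",
          "ssn_last4": "1234", "zip_code": "90245"}

print(authenticate("none", {}, record))                         # no checks needed
print(authenticate("standard", {"zip_code": "90245"}, record))  # zip matches
print(authenticate("hipaa", {"phone": "5550100",
                             "date_of_birth": "1980-01-01",
                             "ssn_last4": "0000"}, record))     # SSN mismatch fails
```

The point of the tiering is the one Mark makes: the friction of authentication is applied only where the transaction warrants it.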
Brian Morin: You can see in that case that we did an ANI lookup on her phone number, did the data dip into DSW's database, matched that with her, and then did the authentication. So that was a secure transaction.
Sheri Greenhaus: Are you typically going into the company’s database? Is the information in your system? How does that work?
Brian Morin: Yeah. You nailed it, Sheri. We are doing an API lookup. Let me give you an example. With that ANI lookup, the moment we recognize the phone number, we're matching it with a customer record. We're doing a single API dip to pull all of that data into our system so we can have an intelligent conversation with that caller. And it happens without any visible latency to the caller; we're doing that data dip and have all that data at hand immediately. If we don't have access to that data, well, AI is fueled by data. Could you imagine your live agents taking a call without that data? It's the same way with AI.
Mark Landry: Yeah. We just lease the data for the period of the call. We don't keep it forever; as soon as the call's done, that data leaves our system for security reasons.
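The ANI-lookup pattern Brian and Mark describe can be sketched like this. The class and the CRM client here are stand-ins invented for illustration, not a real SmartAction or DSW interface.

```python
# Illustrative sketch of the ANI-lookup flow: match the caller's phone
# number to a customer record with one API dip before the conversation
# starts, hold the data only for the life of the call, and discard it
# when the call ends ("leasing" the data, in Mark's words).

class CallSession:
    def __init__(self, ani, crm):
        self.ani = ani
        # Single data dip at call start, keyed on the caller's number.
        self.customer = crm.lookup_by_phone(ani)

    def greeting(self):
        if self.customer:
            return f"Hi, {self.customer['first_name']}. Are you calling about your order?"
        return "Hi! Can I get your phone number to look up your account?"

    def end_call(self):
        # Data is leased per call, not retained after it ends.
        self.customer = None

class FakeCRM:
    """Stand-in for the client's database behind an API."""
    records = {"3105550123": {"first_name": "Mark"}}
    def lookup_by_phone(self, ani):
        return self.records.get(ani)

session = CallSession("3105550123", FakeCRM())
print(session.greeting())   # personalized, because the ANI matched a record
session.end_call()
```

An unrecognized number simply falls through to the generic greeting and a verbal account lookup, mirroring what happens when the ANI dip finds no match.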
Sheri Greenhaus: So then, is your solution a SaaS solution? Is it on-premise? How does that work?
Brian Morin: Sure. It is a SaaS solution, but instead of selling software licenses or seats to it, where we would expect the end user to treat it as their own Lego blocks for a do-it-yourself platform, we actually provide all the CX services needed to run that SaaS platform. Simply put, in order for us to do the CX process well, we have eight different teams across eight disparate CX disciplines to get that experience right. And since there is so much orchestration between different experts in their fields, we have found, at least when it comes to voice, that it is better to deliver that experience through a service as opposed to just a platform. Now, if somebody is only doing chat and there is no voice involved, you can take that DIY platform approach, but with voice involved, that's different.
Sheri Greenhaus: Okay. Let’s see. Our next question, it’s referencing if there’s multiple people in a household using the same phone. I guess it goes through authentication, kind of like that first call that asks for that person. And then I’m assuming qualifying to make sure you’re getting the right person that you want to talk to.
Mark Landry: Sure. For AAA, for example, there will be multiple names listed on an account that's tied to a single phone number. If I call in stuck on the side of the road, and my wife, Robin, and I have the same AAA account, then instead of saying, "Oh, hi, Mark," it'll say, "Oh, may I ask who's calling?" And I'll say, "It's Mark." It will then address me as Mark for the rest of the call.
Sheri Greenhaus: Okay.
Mark Landry: I'm sorry, I should clarify: it will know to listen for Mark or Robin. If I say, "It's George," it'll say, "Sorry, can you give me …" and ask a few more security questions, because it doesn't know George.
Sheri Greenhaus: Okay. Let's see. There's a bunch of questions coming in. One is the question we hit at the top, so maybe we can explore it a little further: "How do you make sure the person doesn't think it's a scam call?" And I think, Marilyn, you were talking a little bit about how it's in the scripting. It's in working with the customer and saying, "What can we do to make sure that when the person reaches for that phone, they're comfortable that it's not a scam call?"
Marilyn Cassedy: Yeah. A really good example of this is some scripting changes that we did for a nationwide chain of optometrists' offices. We were doing an outbound call to patients who had missed a recent appointment. The initial script that they requested, and they're the customer, so we did ultimately build it out this way at first, had a piece of dynamic data that we couldn't get into our voice very easily as just about the first thing the bot said. It'd say, "Hi, I'm calling from Dr. Smith's office." But it wasn't just Dr. Smith; it could be any number of different doctors' names. And by having a piece of dynamic data right up at the front like that, which couldn't be easily recorded, we saw our engagement rate was really, really low. When we changed the scripting at the beginning of that prompt to just "Hi," plus the name of the person we're calling, and then, "We're calling from your optometrist's office," without even saying the name of the practice, we were able to increase engagement rates, just because there was a little bit less of that robot-sounding voice right at the start of the call.
Sheri Greenhaus: I’m assuming Marilyn, with the thousands of calls that you’ve heard and listened to, you can advise people on best ways, best practices, how to start these calls.
Marilyn Cassedy: Absolutely, but it's definitely a collaborative exercise. We have the knowledge about how to do an effective voice implementation, but every business we work with also has its own requirements. Sometimes we're at loggerheads, and we'll go with what the customer wants to see, then circle back around later to say, "Okay, maybe we should have approached this another way." We're on a journey with these customers. It's not as though the first thing out of the gate is the end-all be-all. We're open to iterating and making sure that what we build and work on together is going to work for you and your customers.
Sheri Greenhaus: Okay. If there's a series of calls and you can call and call back, does it make sense to also work with some sort of voice recognition company, so you know immediately whether this is the right person or the wrong person, as a first step?
Marilyn Cassedy: That's a great question. We haven't done a lot of work with that specific technology. We've seen more of that in the banking space, if I'm thinking correctly, with voice passwords and voice signatures. It's not something that we've done a ton of work with at this stage, but Mark, is that something that Product is exploring for future implementations?
Mark Landry: We are taking a look at that, yes. But it's less helpful over the 8 kilohertz phone line than it would be over, let's say, a 48 kilohertz connection straight into –
Brian Morin: – their connected device, like a Google Home or Alexa device at home.
Mark Landry: That's right. I don't know how much people know about telephone lines, but they take the top and bottom ranges of your voice's frequencies out, so lots of people end up sounding very similar. This is why even live agents have trouble understanding what people are saying over the phone; it's not a good signal in terms of quality. But our product team is looking ahead: in the future, when telephone lines aren't a thing, well, we're having a conversation right now over something that's not a telephone line at all. We have a better voice signal coming to us over this WebEx than we would over a phone. Could we use voiceprint identification there? That would be a more applicable use case.
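The frequency limitation Mark mentions follows directly from the sampling rate: a signal sampled at 8 kHz can only represent frequencies up to half that rate (the Nyquist limit). The tiny sketch below just computes that limit for the two rates discussed; the exact rates are the ones named in the conversation, and the framing of the phone line as an 8 kHz sampled channel is a standard assumption for narrowband telephony.

```python
# Why narrowband telephony hurts voiceprint accuracy: a sample rate can
# only capture frequencies up to half its value (the Nyquist limit), so
# an 8 kHz phone channel discards everything above 4 kHz, where much of
# what makes a voice distinctive lives. Wideband audio keeps far more.

def nyquist_limit_hz(sample_rate_hz):
    """Highest frequency representable at the given sample rate."""
    return sample_rate_hz / 2

for rate in (8_000, 48_000):
    print(f"{rate} Hz sampling captures frequencies up to "
          f"{nyquist_limit_hz(rate):.0f} Hz")
```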
Sheri Greenhaus: Okay. Have you done any customer satisfaction surveys, or do you know of any surveys comparing talking to a live agent and talking to a virtual agent? Have any of your clients looked into that?
Brian Morin: I would say that's probably at the top of the list for our clients. At the top of the list is looking at two things: what is the ROI, and what is the CX against our live agents? Now, one thing you may have noticed is the Gartner Peer Insights rating that we mentioned earlier; we are the top-rated solution at 4.8 out of 5. One of the reasons for that has to do with our own philosophy as a company: we refuse to automate something unless we feel we can deliver an experience that is as good as or better than a live agent. We are very careful about what we choose to automate, or which swim lanes within an interaction we choose to automate. You can go to our website.
We have 18 published case studies, and I believe all of them cite CSAT scores that are as good as or better than their live agents'. For us, it's something that we pay very, very close attention to. Another really good example of that is AAA. AAA is a top-25 most trusted brand, and they built that status on the human interaction of their emergency roadside assistance, on what they were delivering through their human customer service representatives. They walked that bridge with us to take that interaction away from their humans and deliver it 100% to AI. And they still maintained their top-25 most trusted brand status, and they still have the same CSAT scores with AI as they did with live agents. It's a really good example of an organization where nothing was more important to them than CSAT, and we were able to deliver for them.
Mark Landry: Another area where humans are more likely to embrace a conversation with a machine than with another human being is sensitive medical information and collections. With anything that would cause a feeling of embarrassment, exposure, or vulnerability in the caller, they're more likely to converse and share those details with a machine that can't possibly pass any judgment on them.
Sheri Greenhaus: How easy is it? So, you're on with a virtual agent and you really need to talk to someone; how easy is it to then get over to a live agent?
Brian Morin: Yeah. Really good question, Sheri. Marilyn, do you want to handle that?
Marilyn Cassedy: Yeah, sure. We're definitely not believers in the concept of IVR jail, where you just can't get to a live agent. We ultimately know that the top priority is going to be customer experience. If you feel that you need to speak to an agent, we're going to get you to one. Each of our clients has a different level of tolerance for how easy that is. In most cases, I will say, you just press zero, and all of a sudden you're in line to speak to someone at the call center. That said, there are others who will add a reminder, or one last opportunity to try to keep people in the system, and just say, "Hey, look, the order status I just provided you is going to be the same thing the agent tells you. I'm happy to get you to someone to talk about it, but I promise I'm reading off the same script as them, basically."
Both the agent and the AI are accessing the same API, the same order information. So if a client wants us to try to contain the call by providing that detail, we absolutely can. Like a lot of things, it depends, and there's a range of options to fit every organization.
Brian Morin: In fact, Sheri, we will be listening for any word that indicates any kind of desire for a transfer: say the word "operator," say the word "representative," say the word "agent." In fact, even if they just say the words, "I want to talk."
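The escalation listening Brian describes can be sketched as a simple phrase spotter. This is only a minimal illustration of the idea; the phrase list is invented for the example, not the production grammar, and a real system would use the speech platform's intent recognition rather than substring matching.

```python
# Minimal sketch of escalation detection: watch each caller utterance
# for any word or phrase signaling a desire to reach a human, and route
# to the live-agent queue when one appears. Phrase list is illustrative.

ESCALATION_PHRASES = ("operator", "representative", "agent",
                      "i want to talk", "speak to someone")

def wants_live_agent(utterance):
    """Return True if the utterance contains any escalation phrase."""
    text = utterance.lower()
    return any(phrase in text for phrase in ESCALATION_PHRASES)

print(wants_live_agent("Can I get an operator please?"))  # True
print(wants_live_agent("What's my order status?"))        # False
```

In practice this check runs on every turn of the conversation, so the caller never has to fight the system to reach a human.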
Sheri Greenhaus: Okay. Do you find your customers are using this more inbound, outbound, or a mixture?
Brian Morin: Yeah. I can answer that pretty simply. We see that 30% of our business is outbound. The other 70% is inbound.
Sheri Greenhaus: Okay. And how long does it actually take to get started? Somebody’s completely green here, they don’t have anything, how do you start the process?
Brian Morin: Yeah. Well, Marilyn, you and your team, you do a lot of hand-holding in that area.
Marilyn Cassedy: Yeah, absolutely. I would say there's really a journey that we go on to get someone up and running, starting with designing the solution. The solution we design goes back to the respective development teams. In many cases, our clients' development teams will already have an API available that we can access, and that really speeds things up for the whole process, so that our team can basically take that API and build an application that leverages the information from it. Then we go through a period of QA and UAT testing, and finally to release. Our development period can be as quick as six to eight weeks, depending on what data is currently available. That would be following the design, with probably some testing on top of that, but it really is fairly quick for the robustness of the solution.
Sheri Greenhaus: And how often do you recommend, once it’s up and running, that you need to tune that, what’s going on?
Brian Morin: Yeah. Again, Marilyn that falls with your team.
Marilyn Cassedy: Yeah, for sure. The tuning process really begins at go-live. During the design phase that I mentioned, we've gone through and tried to think of every scenario we would expect a customer to encounter, and to handle it appropriately within the design. But customers, as I bet everyone on this call is well aware, are going to be unpredictable sometimes, and immediately after go-live we're going to learn a lot from your customers and how they interact with the system. We go through a period of almost hypercare, where we're doing pretty intensive tuning for a month up to six weeks, making sure that we have a couple of quick releases to address any issues that may come up and resolve them as quickly as possible.
Once we have the app in a really stable, steady state, based on that information from interacting with customers, we'll probably roll into more of a quarterly or semi-annual release schedule, with more minor changes, tweaks, and updates, as the client's business changes and as we have new enhancements we can add into your system.
Sheri Greenhaus: Okay. We're getting questions on comparing and contrasting what you do with some competitors. We typically don't get into that; I just wanted you to know we're not ignoring your question. We don't get into that in a public forum. The best thing to do is contact SmartAction directly, and there's information there. We will also pass along the information from the Q&A transcript so they can contact you as well. As we wrap up, I'm going to ask each one of you, and you can take 30 seconds, starting with Mark: final thoughts.
Mark Landry: Sure. I just want to impress upon people that an automated conversation doesn't need to have emotional intelligence, which doesn't exist; there's AI, but there isn't automated emotional intelligence. We can infuse the designs of those conversations with our own human emotional intelligence and achieve that tension release, and that often has the result of a contained call.
Sheri Greenhaus: Marilyn.
Marilyn Cassedy: Yeah. I'll just leave you with the idea that partnering with a company like SmartAction is a great way to bring a lot of expertise to your organization relatively quickly, and relatively inexpensively compared to building out your own team. Being able to leverage the knowledge and information that sits within our four walls, or our remote offices, or wherever we are now, can be really powerful in terms of how you engage with your customers. I'm looking forward to hearing from folks.
Sheri Greenhaus: Definitely. And Brian…
Brian Morin: I think they encapsulated it well. No final remarks from me, just that we appreciate everybody's time and look forward to an ongoing conversation.
Sheri Greenhaus: Okay, great. And we finish with one minute to spare. With that minute, I'll just remind everyone that the recording will be available in about 24 hours. We will send you the link to the recording, and you'll have a PDF of the deck. If you'd like to get in touch with SmartAction, there's information right there on your screen. I think there was a lot of interesting information; it certainly got my mind going about the different types of applications that could reduce my frustration. Thank you all for attending, and thank you to the panel at SmartAction.
Brian Morin: Thank you, Sheri.
Mark Landry: Thank you, Sheri.
Sheri Greenhaus: And at this point we will be closing with webcasts. Have a wonderful rest of your day. Bye-bye.
Mark Landry: Bye.
Brian Morin: Thank you.