All posts by Marius Cealera

Field Notes: Build Dynamic IVR Menus with Amazon Connect and AWS Lambda

Post Syndicated from Marius Cealera original https://aws.amazon.com/blogs/architecture/field-notes-build-dynamic-ivr-menus-with-amazon-connect-and-aws-lambda/

This post was co-written by Marius Cealera, Senior Partner Solutions Architect at AWS, and Zdenko Estok, Cloud Architect and DevOps Engineer at Accenture. 

Modern interactive voice response (IVR) systems help customers find answers to their questions through a series of menus, usually relying on the customer to filter and select the right options. Adding more options in these IVR menus and hoping to increase the rate of self-serviced calls can be tempting, but it can also overwhelm customers and lead them to ‘zeroing out’. That is, they select to be transferred to a human agent, defeating the purpose of a self-service solution.

This post provides a technical overview of one of Accenture’s Advanced Customer Engagement (ACE+) solutions, explaining how to build a dynamic IVR menu in Amazon Connect, in combination with AWS Lambda and Amazon DynamoDB. The solution can help address the ‘zeroing out’ problem by providing each customer with a personalized list of menu options. Solutions architects, developers, and contact center administrators will learn how to use Lambda and DynamoDB to build Amazon Connect flows where menu options are customized for every known customer. We have provided code examples to deploy a similar solution.

Overview of solution

“Imagine a situation where a customer navigating a call center IVR needs to choose from many menu options – for example, an insurance company providing a self-service line for customers to check their policies. For dozens of insurance policy variations, the IVR would need a long and complex menu. Chances are that after the third or fourth menu choice the customers will be confused, irritated or may even forget what they are looking for,” says Zdenko Estok, Cloud Architect and Amazon Connect Specialist at Accenture.

“Even splitting the menu in submenus does not completely solve the problem. It is a step in the right direction as it can reduce the total time spent listening to menu options, but it has the potential to grow into a huge tree of choices.”

Figure 1. A static one-layer menu on the left, a three-layer menu on the right

One way to solve this issue is by presenting only relevant menu options to customers. This approach significantly reduces the time spent in the IVR menu, leading to a better customer experience. This also minimizes the chance the customer will require transfer to a human agent.

Figure 2. Selectively changing the IVR menu structure based on customer profile

For use cases with a limited number of menu options, the solution can be achieved directly in the Amazon Connect IVR designer through the use of the Check Contact Attributes block. However, this approach can lead to complex and hard-to-maintain flows for situations where dozens of menu variations are possible. A more scalable solution is to store customer information and menu options in DynamoDB and build the menu dynamically by using a series of Lambda functions (Figure 3).

Consider a customer with the following information stored in a database: phone number, name, and active insurance policies. A dynamic menu implementation will authenticate the user based on the phone number, retrieve the policy information from the database, and build the menu options.
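
For illustration, a customer record keyed by the phone number might look like the following sketch. The attribute names are hypothetical; the table deployed by the sample repository may use a different schema.

// Hypothetical shape of an item in the demo customers/policies DynamoDB table.
const exampleCustomerItem = {
    phoneNumber: "+4915123456789",       // partition key, in international format
    firstName: "Jane",
    policies: ["car", "home", "travel"]  // active policies drive the menu options
};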

The first Lambda function retrieves the customer’s active policies and builds the personalized greeting and menu selection prompt. The second Lambda function maps the customer’s menu selection to the correct menu path. This is required since the menu is dynamic and the items and ordering are different for different customers. This approach also allows administrators to add or change insurance types and their details, directly in the database, without the need to update the IVR structure. This can be useful when maintaining IVR flows for dozens of products or services.
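
A minimal sketch of the first function, assuming the item shape above and a table name passed in through an environment variable, could look like the following. The actual getCustomerDetails* function in the repository may differ.

// Sketch of the first Lambda: authenticate the caller by phone number, read the
// active policies, and return a personalized prompt and menu as flat attributes.
const AWS = require("aws-sdk");
const dynamo = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
    // Amazon Connect passes the caller's number in the contact data
    const phoneNumber = event.Details.ContactData.CustomerEndpoint.Address;

    const result = await dynamo
        .get({ TableName: process.env.POLICIES_TABLE, Key: { phoneNumber } })
        .promise();

    if (!result.Item) {
        return { customerFound: "false" };
    }

    // Build a prompt such as "Press 1 for your car policy. Press 2 for ..."
    const menuPrompt = result.Item.policies
        .map((policy, index) => `Press ${index + 1} for your ${policy} policy.`)
        .join(" ");

    // Amazon Connect expects a flat map of string key/value pairs
    return {
        customerFound: "true",
        firstName: result.Item.firstName,
        menuPrompt: menuPrompt,
        menuOptions: result.Item.policies.join(",")
    };
};

The second function would reverse the mapping: given the digit the customer pressed and the menuOptions attribute returned above, it resolves the selected policy so the flow can branch accordingly.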

Figure 3. IVR flow leveraging dynamically generated menu options

Walkthrough

Sample code for this solution is provided in this GitHub repo. The code is packaged as a CDK application, allowing the solution to be deployed in minutes. The deployment tasks are as follows:

  1. Deploy the CDK app.
  2. Update Amazon Connect instance settings.
  3. Import the demo flow and data.

Prerequisites

For this walkthrough, you need the following prerequisites:

  • An AWS account.
  • AWS CLI access to the AWS account where you would like to deploy your solution.
  • An Amazon Connect instance. If you do not have an Amazon Connect instance, you can deploy one and claim a phone number by following Set up your Amazon Connect instance.

Deploy the CDK application

The resources required for this demo are packaged as a CDK app. Before proceeding, confirm you have CLI access to the AWS account where you would like to deploy your solution.

  1. Open a terminal window and clone the GitHub repository in a directory of your choice:

git clone git@github.com:aws-samples/amazon-connect-dynamic-ivr-menus.git

  2. Navigate to the cdk-app directory and follow the deployment instructions. The default Region is usually us-east-1. If you would like to deploy in another Region, you can run:

export AWS_DEFAULT_REGION=eu-central-1

Update Amazon Connect instance settings

You need to update your Amazon Connect instance settings so that your contact flows can invoke the Lambda functions created by the CDK app.

  1. Log into the AWS console.
  2. Navigate to Services > Amazon Connect. Select your Amazon Connect instance.
  3. Select Contact Flows.
  4. Scroll down to the Lambda section and add the getCustomerDetails* and selectionFulfilment* functions. If the Lambda functions are not listed, return to the Deploy the CDK application section and verify there are no deployment errors.
  5. Select +Add Lambda function.

Import the demo flow

  1. Download the DemoMenu Amazon Connect flow from the flow_archive section of the sample code repository.
  2. Log in to the Amazon Connect console. You can find the Amazon Connect access URL for your instance in the AWS console, under Services > Amazon Connect > (Your Instance Name). The access URL has the following format: https://<your_instance_name>.awsapps.com/connect/login
  3. Create a new contact flow by selecting Contact Flows from the left side menu, and then select Create New Contact Flow.
  4. Select Import Flow (beta) from the upper right corner menu and select the DemoMenu file downloaded at step 1.
  5. Select the first Invoke Lambda block, and verify the getCustomerDetails* Lambda is selected.
  6. Select the second Invoke Lambda block, and verify the selectionFulfilment* Lambda is selected.
  7. Select Publish.
  8. Associate the new flow with your claimed phone number (phone numbers are listed in the left side menu).

Update the demo data and test

For the demo to recognize your phone number, you need to enter your phone number into the demo customers table.

  1. Navigate to the AWS console and select DynamoDB.
  2. From the left side menu, select Tables, open the CdkAppStack-policiesDb* table, and navigate to the Items tab. If the table is empty, verify you ran the populateDBLambda, as mentioned in the CDK deployment instructions.
  3. Select one of the customers in the table, then select Actions > Duplicate. In the new item, enter your phone number (in international format).
  4. Select Save.
  5. Dial your claimed Amazon Connect number. You should hear the menu options based on your database table entry.
Clean up

You can remove all resources provisioned for the CDK app by navigating to the cdk-app directory and running the following command:

cdk destroy

This will not remove your Amazon Connect instance. You can remove it by navigating to the AWS console > Services > Amazon Connect. Find your Connect instance and select Remove.

Conclusion

In this post we showed you how a dynamic IVR menu can be implemented in Amazon Connect. Using a dynamic menu can significantly reduce call durations by helping customers reach relevant content faster in the IVR system, which often leads to improved customer satisfaction. Furthermore, this approach to building IVR menus provides call center administrators with a way to manage menus with dozens or hundreds of branches directly in a backend database, as well as add or update menu options.

Field Notes provides hands-on technical guidance from AWS Solutions Architects, consultants, and technical account managers, based on their experiences in the field solving real-world business problems for customers.

Accelerating Innovation with the Accenture AWS Business Group (AABG)

By working with the Accenture AWS Business Group (AABG), you can learn from the resources, technical expertise, and industry knowledge of two leading innovators, helping you accelerate the pace of innovation to deliver disruptive products and services. The AABG helps customers ideate and innovate cloud solutions through rapid prototype development.

Connect with our team at [email protected] to learn how to use machine learning in your products and services.

 

Zdenko Estok

Zdenko Estok works as a Cloud Architect and DevOps engineer at Accenture. He works with AABG to develop and implement innovative cloud solutions, and specializes in Infrastructure as Code and Cloud Security. Zdenko likes to bike to the office and enjoys pleasant walks in nature.

Field Notes: Improving Call Center Experiences with Iterative Bot Training Using Amazon Connect and Amazon Lex

Post Syndicated from Marius Cealera original https://aws.amazon.com/blogs/architecture/field-notes-improving-call-center-experiences-with-iterative-bot-training-using-amazon-connect-and-amazon-lex/

This post was co-written by Abdullah Sahin, senior technology architect at Accenture, and Muhammad Qasim, software engineer at Accenture. 

Organizations deploying call-center chat bots are interested in evolving their solutions continuously, in response to changing customer demands. When developing a smart chat bot, some requests can be predicted (for example, following a new product launch or a new marketing campaign). There are, however, instances where this is not possible (following market shifts, natural disasters, and so on).

While voice and chat bots are becoming more and more ubiquitous, keeping them up to date with ever-changing demands remains a challenge. It is clear that a build>deploy>forget approach quickly leads to outdated AI that lacks the ability to adapt to dynamic customer requirements.

Call-center solutions that create an ongoing feedback mechanism between incoming calls or chat messages and the chatbot’s AI allow for a programmatic approach to predicting and catering to a customer’s intent.

This is achieved by doing the following:

  • Applying natural language processing (NLP) on conversation data.
  • Extracting relevant missed intents.
  • Automating the bot update process.
  • Inserting human feedback at key stages in the process.

This post provides a technical overview of one of Accenture’s Advanced Customer Engagement (ACE+) solutions, explaining how it integrates multiple AWS services to continuously and quickly improve chatbots and stay ahead of customer demands.

Call center solution architects and administrators can use this architecture as a starting point for an iterative bot improvement solution. The goal is to increase call deflection and drive better customer experiences.

Overview of Solution

The goal of the solution is to extract missed intents and utterances from a conversation and present them to the call center agent at the end of a conversation, as part of the after-work flow. A simple UI was designed for the agent to select the most relevant missed phrases and forward them to an Analytics/Operations Team for final approval.

Figure 1 – Architecture Diagram

Amazon Connect serves as the contact center platform and handles incoming calls, manages the IVR flows and the escalations to the human agent. Amazon Connect is also used to gather call metadata, call analytics and handle call center user management. It is the platform from which other AWS services are called: Amazon Lex, Amazon DynamoDB and AWS Lambda.

Lex is the AI service used to build the bot. Lambda serves as the main integration tool and is used to push bot transcripts to DynamoDB, deploy updates to Lex, and populate the agent dashboard, which is used to flag relevant intents missed by the bot. A generic CRM app is used to integrate the agent client and provide a single, integrated dashboard. For example, this addition to the agent’s UI, used to review intents, could be implemented as a custom page in Salesforce (Figure 2).

Figure 2 – Agent feedback dashboard in Salesforce. The section allows the agent to select parts of the conversation that should be captured as intents by the bot.
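
As an example of the Lambda integration described above, a function invoked from the Amazon Connect flow could persist the bot transcript for later review along the following lines. The table name, attribute names, and the way the transcript is passed in are assumptions for illustration, not the exact implementation of the ACE+ solution.

// Sketch of a Lambda that stores a conversation transcript in DynamoDB so the
// agent dashboard can retrieve it later. Table and attribute names are assumed.
const AWS = require("aws-sdk");
const dynamo = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
    // Amazon Connect passes the contact ID and any flow parameters in the event
    const contactId = event.Details.ContactData.ContactId;
    const transcript = event.Details.Parameters.transcript;

    await dynamo
        .put({
            TableName: process.env.TRANSCRIPTS_TABLE,
            Item: {
                contactId,
                transcript,
                createdAt: new Date().toISOString()
            }
        })
        .promise();

    // Amazon Connect expects a flat map of string key/value pairs in the response
    return { transcriptStored: "true" };
};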

A separate, stand-alone, dashboard is used by an Analytics and Operations Team to approve the new intents, which triggers the bot update process.

Walkthrough

The typical use case for this solution (Figure 3) shows how missing intents in the bot configuration are captured from customer conversations. These intents are then validated and used to automatically build and deploy an updated version of the chatbot. During the process, the following steps are performed:

  1. Customer intents that were missed by the chatbot are automatically highlighted in the conversation.
  2. The agent reviews the transcript and selects the missed intents that are relevant.
  3. The selected intents are sent to an Analytics/Ops Team for final approval.
  4. The operations team validates the new intents and starts the chatbot rebuild process.

Figure 3 – Use case: the bot is unable to resolve the first call (bottom flow). Post-call analysis results in a new version of the bot being built and deployed. The new bot is able to handle the issue in subsequent calls (top flow)

During the first call (bottom flow) the bot fails to fulfil the request and the customer is escalated to a Live Agent. The agent resolves the query and, post call, analyzes the transcript between the chatbot and the customer, identifies conversation parts that the chatbot should have understood and sends a ‘missed intent/utterance’ report to the Analytics/Ops Team. The team approves and triggers the process that updates the bot.

For the second call, the customer asks the same question. This time, the (trained) bot is able to answer the query and end the conversation.

Ideally, the post-call analysis should be performed, at least in part, by the agent handling the call. Involving the agent in the process is critical for delivering quality results. Any given conversation can have multiple missed intents, some of them irrelevant when looking to generalize a customer’s question.

A call center agent is in the best position to judge what is or is not useful and to mark the missed intents to be used for bot training. This is the important logical triage step. Of course, this will occasionally increase the average handling time (AHT). This should be seen as a time investment with the potential to reduce future call times and increase deflection rates.

One alternative to this setup would be to have a dedicated analytics team review the conversations, offloading this work from the agent. This approach avoids the increase in AHT, but also introduces delay and, possibly, inaccuracies in the feedback loop.

The approval from the Analytics/Ops Team is a sign-off on the agent’s work and the trigger for the bot building process.

Prerequisites

The following section focuses on the sequence required to programmatically update intents in existing Lex bots. It assumes a Connect instance is configured and a Lex bot is already integrated with it. See the Amazon Connect documentation for more information on adding Lex to your Connect flows.

It also does not cover the CRM application, where the conversation transcript is displayed and presented to the agent for intent selection. The implementation details can vary significantly depending on the CRM solution used. Conceptually, most solutions will follow the architecture presented in Figure 1: store the conversation data in a database (DynamoDB here) and expose it through an API (Amazon API Gateway here) to be consumed by the CRM application.
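
For example, the Lambda function behind the API Gateway endpoint could return a stored transcript to the CRM application roughly as follows. This is a sketch assuming a Lambda proxy integration and the hypothetical transcripts table introduced earlier; the resource path and attribute names are illustrative.

// Sketch of the Lambda behind API Gateway that returns a transcript to the CRM.
// Resource path, table name, and attribute names are illustrative assumptions.
const AWS = require("aws-sdk");
const dynamo = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
    // For example GET /transcripts/{contactId} with a Lambda proxy integration
    const contactId = event.pathParameters.contactId;

    const result = await dynamo
        .get({
            TableName: process.env.TRANSCRIPTS_TABLE,
            Key: { contactId }
        })
        .promise();

    return {
        statusCode: result.Item ? 200 : 404,
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(result.Item || { message: "Transcript not found" })
    };
};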

Lex bot update sequence

The core logic for updating the bot is contained in a Lambda function that triggers the Lex update. This adds new utterances to an existing bot, builds it and then publishes a new version of the bot. The Lambda function is associated with an API Gateway endpoint which is called with the following body:

{
    "intent": "INTENT_NAME",
    "utterances": ["UTTERANCE_TO_ADD_1", "UTTERANCE_TO_ADD_2", ...]
}
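
For illustration, such an endpoint could be invoked from a Node.js script as in the sketch below. The URL, stage, and intent values are placeholders, and the actual path depends on how the API Gateway resource is configured; Node.js 18+ provides a global fetch implementation.

// Hypothetical invocation of the bot update endpoint (URL and values are placeholders).
const body = {
    intent: "CheckOrderStatus",
    utterances: ["where is my order", "has my order shipped yet"]
};

fetch("https://<api_id>.execute-api.<region>.amazonaws.com/prod/update-bot", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body)
}).then((response) => console.log(response.status)); // 200 once the rebuild is triggered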

Steps to follow:

  1. The intent information is fetched from Lex using the getIntent API.
  2. The existing utterances are combined with the new utterances and deduplicated.
  3. The intent information is updated with the new utterances.
  4. The updated intent information is passed to the putIntent API to update the Lex intent.
  5. The bot information is fetched from Lex using the getBot API.
  6. The intent version present within the bot information is updated with the new intent.

Figure 4 – Representation of Lex Update Sequence
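
As an illustration, steps 1 through 6 could be implemented along the following lines, assuming the lexModel object is an AWS SDK for JavaScript (v2) LexModelBuildingService client, as in the snippets that follow. The function name, version handling, and variable names are assumptions, not the exact code of the ACE+ solution.

// Sketch of steps 1-6: fetch the intent, merge utterances, update the intent,
// then point the bot definition at the updated intent. Names are illustrative.
const AWS = require("aws-sdk");
const lexModel = new AWS.LexModelBuildingService();

async function mergeUtterances(intentName, newUtterances, botName) {
    // 1. Fetch the current intent definition ($LATEST working copy)
    const intent = await lexModel
        .getIntent({ name: intentName, version: "$LATEST" })
        .promise();

    // 2. Combine existing and new sample utterances, removing duplicates
    const sampleUtterances = [
        ...new Set([...(intent.sampleUtterances || []), ...newUtterances])
    ];

    // 3. and 4. Write the updated definition back with putIntent; the checksum
    // returned by getIntent protects against concurrent modifications
    const { createdDate, lastUpdatedDate, version, ...intentDefinition } = intent;
    const updatedIntent = await lexModel
        .putIntent({ ...intentDefinition, sampleUtterances })
        .promise();

    // 5. Fetch the bot definition
    const bot = await lexModel
        .getBot({ name: botName, versionOrAlias: "$LATEST" })
        .promise();

    // 6. Update the intent version referenced by the bot
    bot.intents = bot.intents.map((i) =>
        i.intentName === intentName
            ? { ...i, intentVersion: updatedIntent.version }
            : i
    );

    return bot; // passed to putBot with processBehavior "BUILD" in step 7
}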

 

7. The updated bot information is passed to the putBot API to update Lex, and the processBehavior is set to “BUILD” to trigger a build. The following code snippet shows how this would be done in JavaScript:

const updateBot = await lexModel
    .putBot({
        ...bot,                  // bot definition with the updated intent version
        processBehavior: "BUILD" // triggers a build of the new bot version
    })
    .promise()

8. The last step is to publish the bot. For this, we fetch the bot alias information and then call the putBotAlias API.

const oldBotAlias = await lexModel
    .getBotAlias({
        name: config.botAlias,
        botName: updatedBot.name
    })
    .promise()

return lexModel
    .putBotAlias({
        name: config.botAlias,
        botName: updatedBot.name,
        botVersion: updatedBot.version,
        checksum: oldBotAlias.checksum
    })
    .promise()

Conclusion

In this post, we showed how a programmatic bot improvement process can be implemented around Amazon Lex and Amazon Connect. Continuously improving call center bots is a fundamental requirement for increased customer satisfaction. The feedback loop, agent validation, and automated bot deployment pipeline should be considered integral parts of any chatbot implementation.

Finally, the concept of a feedback-loop is not specific to call-center chatbots. The idea of adding an iterative improvement process in the bot lifecycle can also be applied in other areas where chatbots are used.

Accelerating Innovation with the Accenture AWS Business Group (AABG)

By working with the Accenture AWS Business Group (AABG), you can learn from the resources, technical expertise, and industry knowledge of two leading innovators, helping you accelerate the pace of innovation to deliver disruptive products and services. The AABG helps customers ideate and innovate cloud solutions through rapid prototype development.

Connect with our team at [email protected] to learn how to use machine learning in your products and services.


Field Notes provides hands-on technical guidance from AWS Solutions Architects, consultants, and technical account managers, based on their experiences in the field solving real-world business problems for customers.

 

Abdullah Sahin

Abdullah Sahin is a senior technology architect at Accenture. He is leading a rapid prototyping team bringing the power of innovation on AWS to Accenture customers. He is a fan of CI/CD, containerization technologies and IoT.

Muhammad Qasim

Muhammad Qasim is a software engineer at Accenture and excels in development of voice bots using services such as Amazon Connect. In his spare time, he plays badminton and loves to go for a run.