Conversational Insights provides contact center interaction data to answer business questions and support decisions that drive efficiency. It uses Google's ML and Large Language Models to turn contact center data into insights and action. Based on the outcomes of the insights module, contact center management can take various actions, such as agent coaching, improving bots, and improving contact center workflows.
Objectives
In this lab, you will learn how to:
Ingest conversations into Conversational Insights
Configure a Topic Model to classify conversations into various topics
Configure LLM Summarization module
Analyze conversations to find out information like sentiment, customer satisfaction, summary, resolution, highlights and more
Export conversations to BigQuery for further analysis or to create customized dashboards
Setup and Requirements
Before you click the Start Lab button
Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.
This Qwiklabs hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.
What you need
To complete this lab, you need:
Access to a standard internet browser (Chrome browser recommended).
Time to complete the lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab.
Note: If you are using a Pixelbook, open an Incognito window to run this lab.
Activate Cloud Shell
Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5GB home directory and runs on Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources.
In the Cloud Console, in the top right toolbar, click the Activate Cloud Shell button.
Click Continue.
It takes a few moments to provision and connect to the environment. When you are connected, you are already authenticated, and the project is set to your PROJECT_ID. For example:
gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.
You can list the active account name with this command:
gcloud auth list
If one of these APIs is not listed, click the ENABLE APIS AND SERVICES button at the top of the screen, type the name of the missing API into the search box, select the API from the search results, then click Enable.
Navigate to Vertex AI > Workbench.
Select Instances and click on the Open JupyterLab button beside ccai-notebook. You may need to wait a few minutes while the instance is being provisioned.
Once the notebook is open, click on the Terminal icon from Launcher. This will open up a new Terminal window.
Ensure that the conversations are preloaded into the expected Google Cloud Storage location by running the following commands in your terminal window:
export PROJECT_ID="{{{project_0.project_id|project_id}}}"
gcloud storage ls gs://${PROJECT_ID}-ccai-insights-sample-transcripts
You can explore the conversations in the Cloud Console as well. Return to the Cloud Console and navigate to Cloud Storage > Buckets. Click on the pre-existing bucket whose name ends with -ccai-insights-sample-transcripts.
Click on abcd_1.json. Click on the Download button to download the file. Open the file on your machine and ensure that its format matches the recommended conversation format for chat transcripts.
The JSON file should have fields like entries, text, role and user_id as shown below.
This validation is important before proceeding with the rest of the steps. If the format of the JSON transcript were not correct, it would require some pre-processing first. Since the sample data in this lab is correctly formatted, you can proceed.
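The format check described above can be sketched in Python. This is a minimal illustration, not part of the lab's scripts: the required field names (entries, text, role, user_id) come from the lab, but the sample transcript below is hypothetical, not the real abcd_1.json.

```python
import json

# Minimal sketch of a format check for the chat-transcript JSON described above.
# The sample transcript is illustrative, not data from the real bucket.
REQUIRED_ENTRY_FIELDS = {"text", "role", "user_id"}

def validate_transcript(transcript):
    """Return a list of problems; an empty list means the format looks OK."""
    problems = []
    entries = transcript.get("entries")
    if not isinstance(entries, list) or not entries:
        return ["missing or empty 'entries' list"]
    for i, entry in enumerate(entries):
        missing = REQUIRED_ENTRY_FIELDS - entry.keys()
        if missing:
            problems.append("entry %d is missing fields: %s" % (i, sorted(missing)))
    return problems

sample = json.loads("""
{
  "entries": [
    {"text": "Hi, I need help with my order.", "role": "CUSTOMER", "user_id": 1},
    {"text": "Happy to help. What is the order number?", "role": "AGENT", "user_id": 2}
  ]
}
""")

print(validate_transcript(sample))  # [] means the sample passes the check
```

Running a check like this over every file in the bucket before ingestion is one way to catch transcripts that would need pre-processing first.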
Task 2: Load conversations into Conversational Insights
Now return to your JupyterLab instance and open a new Terminal if you closed the previous one. Enter the following commands to start a Python virtual environment and install the required packages.
This command will install all required packages mentioned within the requirements.txt file inside the notebook. Its output should look like this:
Two files required to ingest data into Conversational Insights have already been created: import_conversations_v2.py and ingest_transcripts_from_gcs.sh.
import_conversations_v2.py contains the Python code to perform Speech-to-Text (STT) if required (the _TranscribeAsync function), Data Loss Prevention (DLP) redaction (_RedactTranscript), metadata configuration (_GetMetaDataFromTranscription), and ingestion into Conversational Insights (_CreateInsightsConversation). Feel free to read through these functions if you would like an in-depth understanding of the code being executed.
If you alter the code in the `import_conversations_v2.py` file, then the following steps may fail.
ingest_transcripts_from_gcs.sh is the wrapper script that takes configurable arguments from the user and executes the Python file mentioned above. You need to modify certain fields of this file to perform ingestion. Open ingest_transcripts_from_gcs.sh, update the source_chat_transcript_gcs_bucket and PROJECT_ID variables as shown below, and then save the file:
export source_chat_transcript_gcs_bucket="{{{project_0.project_id|project_id}}}-ccai-insights-sample-transcripts"
export PROJECT_ID="{{{project_0.project_id|project_id}}}"
sudo sed -i 's/qwiklabs-gcp-02-709a56e331c9-ccai-insights-sample-transcripts/{{{project_0.project_id|project_id}}}-ccai-insights-sample-transcripts/g' ingest_transcripts_from_gcs.sh
sudo sed -i 's/qwiklabs-gcp-02-709a56e331c9/{{{project_0.project_id|project_id}}}/g' ingest_transcripts_from_gcs.sh
After these modifications, the script should look like this:
Open the terminal again and execute the following command; this will trigger the ingestion process:
This will start ingesting conversations, and the results will be visible in the terminal as shown below:
Since 100 conversations are ingested in this step, it takes about 2-3 minutes to complete.
Navigate to Conversational Insights on a new tab. This will open up the Insights console. Once opened, select the project and global location as shown below.
You should be able to see 100 conversations ingested as shown below:
Explore all UI components including conversation filters. Click on any conversation to look at the transcript and ensure that PII data is masked properly as shown below.
Click Check my progress to verify the objective.
Load conversations into Conversational Insights
Task 3. Configure Topic Model
Navigate to the Topic Model section in the Insights console from the left bar and click Create a new model.
Input the following options for creating a new topic model:
Model Name: ccai-demo-model
Model training strategy: Topic Model V2
Model language: English
You can also apply additional filters on the conversations, such as turn count, agent ID, or even label-based filters. Click Next, then Acknowledge, and click Start Training.
Note: This initiates the model training. However, since we have not ingested the minimum required training volume of 1000 conversations, the training will fail with a message like this:
The purpose of this task was to walk you through topic model configuration. Training a topic model takes around 90 minutes when done with 1000 conversations.
Task 4. Configure LLM Summarization
Now click on the main menu on the Conversational Insights page and navigate to the Agent Assist console in a new tab. Alternatively, you can type in the URL https://agentassist.cloud.google.com/ in a new tab and select the current project.
Go to Conversation Profiles and click on Create. Provide the following values for the options:
Display name: ccai-demo-profile
Suggestion types: Conversation summarization (legacy)
Suggestion model type: Baseline Model
Baseline model version: 2.0
Select all output sections (Situation, Action, Resolution, Customer satisfaction, Reason for cancellation, and Entities). For more information, refer to the documentation.
Leave the rest of the values as their defaults and click the Create button.
This will create the conversation profile. Copy the integration ID and make a note of it; it will be used in the analyze section later.
Click Check my progress to verify the objective.
Configure LLM Summarization
Task 5. Analyzing Conversations
You can now get started analyzing the conversations. There are two ways to analyze conversations: bulk analysis and analyzing individual conversations.
First, navigate back to the Insights console. To bulk analyze conversations from the UI, click the Analyze button in the Conversation Hub directly, as shown below:
Any filters you apply on this page automatically propagate when you analyze conversations. For example, you can filter for conversations that have not yet been analyzed and analyze only those, as shown below. You can also select the percentage of the conversations that you want to analyze. This step takes about 5 minutes to complete. When the process completes, the completed analysis job appears in the notification bar in the top right corner of the Insights console.
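The filter-then-sample behavior of bulk analysis can be pictured with a small sketch. The console does this selection server-side; the conversation records below are hypothetical stand-ins:

```python
# Sketch of the bulk-analyze selection logic: apply a filter (here,
# "not yet analyzed") and then take a percentage of the matching conversations.
def select_for_analysis(conversations, percentage):
    unanalyzed = [c for c in conversations if not c["analyzed"]]
    count = round(len(unanalyzed) * percentage / 100)
    return unanalyzed[:count]

# 100 hypothetical conversations, 40 of which are already analyzed.
conversations = [{"id": i, "analyzed": i < 40} for i in range(100)]
selected = select_for_analysis(conversations, 50)
print(len(selected))  # 30: half of the 60 unanalyzed conversations
```

This mirrors why the "not analyzed" filter plus a percentage slider lets you control both which conversations and how many of them get analyzed.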
Another way to analyze conversations is to open up a conversation and click on analyze separately. Select a conversation and then click the Analyze button.
Here, you have options to select what to include in the analysis. Check the Conversational Insights Analysis and Summary boxes, but leave Topic Model labeling unchecked, as you do not have a deployed Topic Model in your environment. Click the Analyze button to begin.
Once the conversation is analyzed, we can see the summary, sentiment analysis, entities and smart highlights of the conversation as shown below:
However, this does not use the conversation profile created in the previous step, so the summary section contains only a single-line summary and not the sections selected in the conversation profile, such as customer satisfaction, resolution, situation, and action. To get those, you need to analyze the conversations through the API, which is covered in the next step.
To analyze a conversation via the API, copy the Conversation ID of any conversation from the Insights console.
Also copy the integration ID of the Conversation Profile created in the Agent Assist console. This is accessible at https://agentassist.cloud.google.com/projects//locations/global/conversation-profiles.
Navigate back to your Vertex AI Workbench instance on another tab. If you closed this tab, then return to console and navigate to Vertex AI > Workbench. Select Instances and click on the Open JupyterLab button beside ccai-notebook. Once the notebook is open, click on the Terminal icon from Launcher if a terminal is not already open.
In your open terminal run the following commands, replacing YOUR_CONVERSATION_ID with your Conversation ID and YOUR_INTEGRATION_ID with the Integration ID copied before.
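For orientation, here is where the two IDs fit in. The resource-name patterns below are assumptions based on the public v1 APIs (Contact Center AI Insights conversations and Dialogflow/Agent Assist conversation profiles); my-project is a placeholder for your lab project ID, and the YOUR_* values are the same placeholders as in the step above:

```python
# Sketch of how the copied IDs map onto API resource names.
# Patterns assumed from the public v1 APIs; all IDs are placeholders.
project_id = "my-project"
conversation_id = "YOUR_CONVERSATION_ID"
integration_id = "YOUR_INTEGRATION_ID"

# Insights conversation (the lab uses the global location).
conversation_name = (
    "projects/%s/locations/global/conversations/%s" % (project_id, conversation_id)
)
# Agent Assist conversation profile, identified by the integration ID.
conversation_profile_name = (
    "projects/%s/locations/global/conversationProfiles/%s" % (project_id, integration_id)
)
print(conversation_name)
print(conversation_profile_name)
```

Seeing the full resource names makes it easier to spot a mistyped ID before running the analysis command.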
Upon execution of the command, an operation ID is returned, which can be polled to view the status of the LLM summarization.
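Polling a long-running operation of this kind usually means fetching the operation until its done flag is set. Here is a generic sketch; get_operation is a stand-in function, not the real API client, which would fetch the operation resource by name instead:

```python
import time

# Generic long-running-operation poll loop. get_operation is a stand-in:
# a real client would fetch the operation resource by its ID instead.
def poll_operation(get_operation, operation_id, interval_s=1, max_attempts=10):
    for _ in range(max_attempts):
        op = get_operation(operation_id)
        if op.get("done"):
            return op
        time.sleep(interval_s)
    raise TimeoutError("operation %s did not finish" % operation_id)

# Simulated operation that completes on the third poll.
calls = {"n": 0}
def fake_get_operation(op_id):
    calls["n"] += 1
    return {"name": op_id, "done": calls["n"] >= 3}

result = poll_operation(fake_get_operation, "op-123", interval_s=0)
print(result["done"])  # True
```

In this lab the operation finishes in a few seconds, so a short interval and a small attempt cap are enough.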
However, this should not take long (1-3 seconds). If you go back to the Conversational Insights console and refresh the same conversation, you will see the updated summary section containing the LLM summarization. It contains all the sections configured in the earlier task. Notice how different this summary is compared to the previous section.
Click Check my progress to verify the objective.
Analyze conversations
Task 6. Exporting Conversations to BigQuery
We can also export conversations into BigQuery to perform any custom analysis on the results or even connect it with a dashboard tool like Looker.
Before we export conversations to BigQuery, we need to create an empty table within a dataset in the BigQuery console.
Return to the Cloud Console and navigate to BigQuery > BigQuery Studio and in the Explorer section, click on Create dataset within the current project as shown below:
Name the dataset ccai_demo_dataset. Select Region for the Location type and choose a region. Click Create Dataset.
Once the dataset is created and visible on Explorer, click on Create Table as shown below:
Name the table ccai_demo_insights_table and click Create Table without modifying anything else. This creates an empty table into which Insights data can be exported.
Now, go back to the Insights console and, from the Conversation Hub, click Export. Note that any filter applied to the conversations is automatically propagated to the export section as well.
Select the newly created Dataset and Table from the previous steps and click on Export. This will trigger a long running job.
This job can take several minutes to complete; its status can be seen in the notification bar at the top right of the Insights console.
Return to the Cloud Console and navigate to BigQuery > BigQuery Studio. In the Explorer, navigate under the project to ccai_demo_dataset and ccai_demo_insights_table. Click Schema to take a look at the overall schema of the BigQuery Insights table.
Click Preview to observe the data in the table.
You can also query the conversation summaries in BigQuery. Click on Query and then click In New tab. Copy the following query into the Query Editor and click Run to execute the query.
SELECT
conversationName,
sen.sentence,
JSON_QUERY(JSON_VALUE(ant.annotationRecord), '$.text') AS summary_text
FROM
ccai_demo_dataset.ccai_demo_insights_table,
UNNEST(`sentences`) AS sen,
UNNEST(`annotations`) AS ant
WHERE ant.type = 'CONVERSATION_SUMMARIZATION_SUGGESTION'
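The nested JSON_VALUE/JSON_QUERY extraction in the query can be hard to read. Roughly, it unwraps a JSON-encoded string stored in annotationRecord and then picks out its text field. A Python analogue, using a hypothetical annotationRecord value (the real exported payload contains more fields):

```python
import json

# Rough Python equivalent of JSON_VALUE(annotationRecord) followed by
# JSON_QUERY(..., '$.text'). The sample value is hypothetical: a JSON-encoded
# string that itself contains a JSON object with a "text" field.
annotation_record = json.dumps(json.dumps({"text": "Customer asked to cancel order."}))

inner = json.loads(annotation_record)      # ~ JSON_VALUE: unwrap the JSON string
summary_text = json.loads(inner)["text"]   # ~ JSON_QUERY '$.text': pick the field
print(summary_text)
```

If the query returns NULL summaries, checking one annotationRecord value this way helps confirm whether the payload is double-encoded as assumed here.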
Click Check my progress to verify the objective.
Export conversations to BigQuery
Congratulations! You have ingested conversations into Conversational Insights, configured a Topic Model to classify conversations into various topics, configured an LLM summarization module, and analyzed conversations to surface information like sentiment, customer satisfaction, summary, resolution, and highlights. You also exported conversations to BigQuery for further analysis or possible dashboarding.
End your lab
When you have completed your lab, click End Lab. Qwiklabs removes the resources you’ve used and cleans the account for you.
You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.
The number of stars indicates the following:
1 star = Very dissatisfied
2 stars = Dissatisfied
3 stars = Neutral
4 stars = Satisfied
5 stars = Very satisfied
You can close the dialog box if you don't want to provide feedback.
For feedback, suggestions, or corrections, please use the Support tab.
Duration: Setup 3 min · Lab access 90 min · Completion time 90 min