Articles on: Conversation Intelligence

Leveraging AI to identify the tones and sentiments of the speakers on a call

Get an overall sense of a conversation at a glance from the emojis on the timeline. Tones help users and teams understand a customer's sentiment (happy, sad, friendly, angry, or confident) and gain insight into their emotions.



Tones are available after the call for both “Fast” and “In-depth Analysis”.

The AI attempts to predict a tone for each sentence of every speaker's transcript in the call.

Each sentence in the call transcript is mapped to at most one tone; the AI never predicts multiple tones for a single sentence.

Each tone is shown with its respective emoji:

Happy - 😃

Sad - 😔

Friendly - 😊

Angry - 😠

Confident - 😏
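For illustration, here is a minimal TypeScript sketch of the one-tone-per-sentence model and the emoji mapping above. The type and field names are hypothetical, not Outplay's actual data model.

```typescript
// Hypothetical types illustrating the one-tone-per-sentence model.
type Tone = "happy" | "sad" | "friendly" | "angry" | "confident";

// Each transcript sentence carries at most one AI-predicted tone.
interface TranscriptSentence {
  text: string;
  startSeconds: number; // where the sentence begins in the recording
  tone?: Tone;          // undefined when the AI predicts no tone
}

// Emoji shown for each tone, matching the list above.
const toneEmoji: Record<Tone, string> = {
  happy: "😃",
  sad: "😔",
  friendly: "😊",
  angry: "😠",
  confident: "😏",
};
```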

The emojis appear on the timeline in sync with the position of their sentence within the speaker's paragraph. For example, if a “happy” tone is identified at the start of a paragraph and a “sad” tone at the end, 😃 is placed at the start of the monologue bar and 😔 at its end. Clicking an emoji repositions the seek bar to the sentence in the transcript tab for which the AI predicted the tone.
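The placement and click behavior can be pictured with the sketch below, reusing the hypothetical TranscriptSentence type from the sketch above. This is an illustrative approximation of the described behavior, not Outplay's implementation.

```typescript
// Hypothetical sketch: place each tone emoji along a monologue bar in
// proportion to where its sentence starts within the monologue.
interface Monologue {
  startSeconds: number;
  endSeconds: number;
  sentences: TranscriptSentence[];
}

// Returns a 0..1 offset for positioning the emoji on the bar:
// 0 = start of the monologue bar, 1 = end.
function emojiOffset(monologue: Monologue, sentence: TranscriptSentence): number {
  const duration = monologue.endSeconds - monologue.startSeconds;
  if (duration <= 0) return 0;
  return (sentence.startSeconds - monologue.startSeconds) / duration;
}

// Clicking an emoji moves the seek bar to the sentence's timestamp.
function onEmojiClick(sentence: TranscriptSentence, seekTo: (s: number) => void): void {
  seekTo(sentence.startSeconds);
}
```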

Apart from the timeline, the emojis are also shown next to their respective sentences in the transcript tab.


Updated on: 31/05/2024
