Leveraging AI to identify the tones/sentiments of speakers in a call

Get an overall idea of the conversation at a glance from the emojis on the timeline. Tones help users and teams understand the sentiments of customers (happy, sad, friendly, angry, confident) and gain insight into their emotions.

  1. Tones will be available after the call for both “Fast” and “In-depth Analysis”. This is how Outplay will show the tones.

  2. AI will attempt to predict a tone for each transcribed sentence from every speaker on the call.

  3. A given sentence in the call transcription can be mapped to only a single tone; AI will not predict multiple tones for one sentence.

  4. Each tone is shown with its respective emoji (see the mapping sketch after this list):

    1. Happy - 😃

    2. Sad - 😔

    3. Friendly - 😊

    4. Angry - 😠

    5. Confident - 😏

  5. The emojis will be shown on the timeline in sync with the position of their sentences within the paragraph. For example, if the tones “happy” and “sad” are identified at the start and middle of a paragraph, then 😃 will be placed at the start of the monologue bar and 😔 at its middle (see the positioning sketch after this list). Clicking an emoji repositions the seek bar to the sentence in the transcript tab for which the tone was predicted.

  6. Apart from the timeline, the emojis will also be shown next to their respective sentences in the transcript tab.
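
For illustration, here is a minimal TypeScript sketch of the per-sentence tone model and tone-to-emoji mapping described in items 2–4. All names (`Tone`, `TranscriptSentence`, `TONE_EMOJI`) are hypothetical stand-ins and are not part of Outplay's product or API.

```typescript
// The five tones the AI can predict (item 4 above).
type Tone = "happy" | "sad" | "friendly" | "angry" | "confident";

// At most one tone per transcribed sentence (item 3 above).
interface TranscriptSentence {
  speaker: string;      // speaker on the call
  text: string;         // transcribed sentence
  startSeconds: number; // offset of the sentence within the recording
  tone?: Tone;          // absent when the AI predicts no tone
}

// Emoji shown for each predicted tone (item 4 above).
const TONE_EMOJI: Record<Tone, string> = {
  happy: "😃",
  sad: "😔",
  friendly: "😊",
  angry: "😠",
  confident: "😏",
};
```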
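
Likewise, a hedged sketch of the timeline behavior in item 5: each emoji sits along the monologue bar in proportion to where its sentence starts, and clicking it seeks the player to that sentence. This reuses `TranscriptSentence` from the sketch above; `MonologueBar`, `emojiOffsetPx`, and `onEmojiClick` are similarly hypothetical, not Outplay APIs.

```typescript
interface MonologueBar {
  startSeconds: number; // where this speaker's paragraph begins
  endSeconds: number;   // where it ends
  widthPx: number;      // rendered width of the bar
}

// Pixel offset along the bar at which the emoji should sit,
// proportional to the sentence's start time within the paragraph.
function emojiOffsetPx(bar: MonologueBar, sentence: TranscriptSentence): number {
  const duration = bar.endSeconds - bar.startSeconds;
  const fraction = duration > 0
    ? (sentence.startSeconds - bar.startSeconds) / duration
    : 0;
  // Clamp to [0, 1] so the emoji never overflows the bar.
  return Math.min(Math.max(fraction, 0), 1) * bar.widthPx;
}

// On click, reposition the seek bar to the sentence's timestamp.
function onEmojiClick(sentence: TranscriptSentence, seekTo: (s: number) => void): void {
  seekTo(sentence.startSeconds);
}
```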