About Automated Analysis
Automated analysis, also known as AI-powered coaching, creates value for both managers and the employees they supervise. Managers have limited bandwidth and cannot easily scale feedback and coaching when supervising large teams. But automated analysis helps managers identify, at a glance, the skills and practices on which their teammates need coaching.
As for learners, they receive just-in-time feedback as they acquire new abilities. They needn't wait for lessons to be graded to see the results of their practice activities. Instead, automated analysis delivers feedback straight to their inbox, minutes after they submit a response.
When automated analysis is added to webcam, screen, or audio recording elements, learner audio is completely transcribed and analyzed for the following indicators:
Correct and incorrect keywords or phrases
Speech pace, measured in words per minute
Filler words and discourse markers
Confidence factors
After analysis, a complete transcript, including key performance indicators, is emailed to the learner. These results are also posted to the grading station when practice activities are set to be graded.
Before you begin:
You must have permission to create and edit content.
You must have permission to edit company settings.
Enable Automated Analysis
Navigate to the Settings menu by selecting the cog icon in the upper right corner of the Lessonly interface, then select Training Content Settings.
Under the In-Lesson heading, select the checkbox next to "Allow creators to add Automated Analysis on recorded practice elements."
📝 Note: If this setting is disabled, it means you don't have a TenantID; one will need to be provisioned on your behalf. Please contact your Account Manager for more information.
Add Automated Analysis to Practice Activities
Automated analysis can be added to Learning's recorded practice elements: webcam, screen, or audio recording.
To get started, navigate to the lesson builder, select Add Element, then select a Recorded Practice element to include in your lesson.
Automated Analysis settings are found within the element itself, as pictured below.
You can give learners four kinds of automatic feedback based on their responses: keyword matching, filler word usage, pace, and confidence. Each is discussed in greater detail below.
Keyword Matching
Keyword matching helps sellers stay on message by identifying correct and incorrect keywords in the transcript of a practice sales pitch. When keyword matching is enabled, learners will see exactly where and when they mentioned favored or disfavored keywords. This helps sellers speak a prospect's language, winning their trust in the process.
Once keyword matching has been enabled, you'll see two text fields, one for correct keywords or phrases and one for incorrect keywords. Add a text string to either field, then press Enter to validate your choices. To delete a keyword, select the X on its label.
💡 Tip: As a best practice, select ten keywords or fewer, and restrict phrases to small groups of words. This makes processing more efficient, and it heightens the salience of the words you select for analysis.
After learners record and submit a response to your prompt, the audio is transcribed and analyzed for correct and incorrect keywords. The learner's transcript is scored for accuracy based on the number and frequency of incorrect keywords. This enables trainers to set targets for sellers to attain or surpass.
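Conceptually, this stage amounts to scanning the transcript for each configured string and tallying the hits. Here's a minimal Python sketch of the idea; the function name, the sample transcript, and the whole-word matching rule are illustrative assumptions, not Seismic's implementation.

```python
import re

def find_keywords(transcript: str, keywords: list[str]) -> dict[str, int]:
    """Count case-insensitive, whole-word occurrences of each keyword or phrase."""
    counts = {}
    for kw in keywords:
        pattern = r"\b" + re.escape(kw) + r"\b"
        counts[kw] = len(re.findall(pattern, transcript, flags=re.IGNORECASE))
    return counts

# Hypothetical pitch transcript and keyword lists
transcript = "Our toner adhesion is best in class, and I mean best in class."
print(find_keywords(transcript, ["toner adhesion", "best in class"]))
# {'toner adhesion': 1, 'best in class': 2}
print(find_keywords(transcript, ["cheap", "discount"]))
# {'cheap': 0, 'discount': 0}
```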
Note that keyword matching cannot identify keywords that were never spoken. Suppose it's imperative that Andy, a seller in training, use the phrase "toner adhesion", but he plumb forgets to mention it. Keyword analysis won't register the omission. Instead, use the practice element's answer key to bring such gaps to a learner's attention. Include a note like the one below, instructing graders to listen for and note any keywords a learner fails to mention.
Filler Word Usage
Filler words, or hesitation markers, are words or phrases such as "um", "like", and "you know". These words add no meaning to the content of your speech. Worse, they make you sound unconfident and unprepared, and they may distract a prospect from the core of your message.
To help sellers avoid filler words, analyze their responses for these hesitation markers. This will help learners home in on the parts of a pitch that need more practice. Unlike keyword matching, you don't need to specify the filler words to be identified; Learning listens for and identifies filler words based on an internal catalog of such words.
After analysis, the filler words in a learner's submission will be rendered as a percentage of all words in a transcript.
It's been estimated that filler words constitute 6 to 10 percent of what we say extemporaneously, but a sales pitch is anything but improvised! We recommend keeping filler words to 5 percent or less of a given transcript.
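To make that 5 percent target concrete, the calculation is simply fillers divided by total words. A quick sketch, assuming a small sample list of fillers (Learning's real catalog is internal):

```python
# Illustrative sample of hesitation markers; Learning maintains its own catalog.
FILLER_WORDS = {"um", "uh", "er", "ah", "like"}

def filler_percentage(transcript: str) -> float:
    """Return filler words as a percentage of all words in the transcript."""
    words = [w.strip(".,!?").lower() for w in transcript.split()]
    if not words:
        return 0.0
    fillers = sum(1 for w in words if w in FILLER_WORDS)
    return 100 * fillers / len(words)

pitch = "Um, our platform, like, really scales with your team."
print(f"{filler_percentage(pitch):.1f}%")  # 2 fillers out of 9 words ≈ 22.2%
```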
Pace Analysis
When the pace tool is enabled, learner audio is analyzed for its speaking rate, measured by words spoken per minute (WPM).
While there isn't an ideal speech pace per se, ensure that you're neither speaking so slowly that you frustrate or bore your audience, nor so quickly that you impair comprehension. Aim for a speech rate between 140 and 160 WPM.
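The arithmetic behind the pace gauge is straightforward: words divided by minutes. A minimal sketch, assuming you know the transcript and the recording length in seconds:

```python
def words_per_minute(transcript: str, duration_seconds: float) -> float:
    """Compute speaking pace as words spoken per minute."""
    return len(transcript.split()) / (duration_seconds / 60)

# A 90-second response containing 220 words works out to about 147 WPM,
# comfortably inside the recommended 140-160 range.
sample = " ".join(["word"] * 220)
print(round(words_per_minute(sample, 90)))  # 147
```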
Confidence Analysis
This feature is currently in early-access testing. Want to join Seismic's First Wave beta program? Click here to sign up!
Confidence measures learners' self-assurance by analyzing the audio of their recorded responses. It gauges the confidence of a given reply by analyzing speaking cues such as tone of voice, level of hesitation, and more. A proprietary machine learning model judges these variables and returns one of three confidence scores: low, moderate, or high confidence.
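The model itself is proprietary, but the shape of the task is easy to picture: several speech cues go in, one of three labels comes out. The toy sketch below is purely illustrative; the cue names, weights, and thresholds are invented for this example and bear no relation to Seismic's model.

```python
def confidence_label(hesitation_ratio: float, tone_steadiness: float) -> str:
    """Toy stand-in for a trained classifier: maps speech cues to a label.
    The real feature is a proprietary ML model; these weights are invented."""
    score = (1 - hesitation_ratio) * 0.6 + tone_steadiness * 0.4
    if score >= 0.7:
        return "high"
    if score >= 0.4:
        return "moderate"
    return "low"

print(confidence_label(hesitation_ratio=0.05, tone_steadiness=0.8))  # high
```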
Because it leverages a machine learning model, Learning's confidence analysis will improve the more you train it. Graders can use a feedback mechanism to confirm whether they agree with the confidence scores that automated analysis returns.
Over time, this will increase your confidence in Confidence.
Learner Point of View
Practice activities that include automated analysis are labeled as such. Learners can mouse over this label to read a brief description of automated analysis and the work it performs.
When learners complete automated analysis activities, they're greeted by this message: "Your Automated Analysis results will finish processing soon. A transcript of the video you submit will be analyzed for things like correct and incorrect keywords, filler word usage, and words per minute. Results will be emailed to you when ready."
This email carries the subject line "Your results are in!" The message body includes a snapshot of the learner's performance according to whichever indicators were established when the practice activity was created.
Here's what a typical results email looks like.
See that View Automated Analysis button? When recipients select it, they're taken to their report card. Here, learners can read feedback left by their evaluators, when applicable.
Grading Station View
When practice activities are set to be graded, learner submissions are sent to the Grading Station. Here, graders can watch recordings, review automated analysis results, and give learners feedback.
To see the results of an automated analysis, select the video thumbnail; this will open the recording in a new window. The results of any analyses you've enabled will be displayed beneath the video.
Keyword Matching shows the correct and incorrect keywords identified by automated analysis. Correct keywords and phrases are highlighted in green; incorrect ones are highlighted in red.
Pace shows a color-coded gauge measuring the speaker's words per minute. Responses that fall within the green range unfurl at an ideal pace, while the yellow and red zones may indicate areas for improvement.
Filler Word Usage indicates the percentage of filler words identified in a user's submission. Green indicates a response with few or no filler words, yellow indicates a response with a moderate level of discourse markers, and red indicates a response that may be negatively impacted by ums, ahs, and other hesitation words.
A complete transcript of the learner's submission is also included. This section contains a transcript accuracy score that reflects the number of unknown, uncertain, or anomalous words identified during speech-to-text transcription.
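One plausible way such a score could be derived, shown here as a hedged sketch: speech-to-text engines typically attach a recognition confidence to each word, and the accuracy score can be read as the share of words recognized cleanly. The data shape and the 0.8 cutoff below are assumptions for illustration only.

```python
# Hypothetical per-word output from a speech-to-text engine
words = [
    {"text": "our", "confidence": 0.99},
    {"text": "toner", "confidence": 0.97},
    {"text": "adhesion", "confidence": 0.62},  # uncertain word
]

# Words below the cutoff count as unknown, uncertain, or anomalous
uncertain = sum(1 for w in words if w["confidence"] < 0.8)
accuracy = 100 * (len(words) - uncertain) / len(words)
print(f"Transcript accuracy: {accuracy:.0f}%")  # 67%
```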
The right-hand pane of the video window contains the usual options for grading a learner's submission. Graders must mark a response correct or incorrect before they can submit the grade.
📝 Note: Automated analysis results do not impact the manual grading process. Also, Automated Analysis is unavailable for audio with fewer than ten words detected.
Report Card View
After learners complete lessons that include practice questions with automated analysis, their report cards are updated with a copy of the submission they recorded. Selecting "See Automated Analysis Results" launches a window displaying their results. Note that submissions set to be graded must receive a grade before learners can review them.
In the response window, users can review the results of the automated analysis, including a transcript of their recording, and any feedback left by the grader.
If the question is not graded, users see the same view, but the results are available immediately and no feedback is included.
Best Practices
Automated analysis is powerful. Used responsibly, it has the potential to transform the way organizations onboard, train, and coach learners. Precisely because automated analysis is so powerful, it's worth taking a moment to consider what responsible use of the technology looks like, including its potential uses and misuses.
Don't Evaluate. Empower.
When automated analysis is enabled, a user's recorded response is transcribed and assessed for correct and incorrect keywords. The results of this analysis should be used to enhance feedback, giving learners concrete suggestions for improving their skills. Results should not be used in performance evaluations or in hiring-and-firing decisions.
Be Transparent
Before introducing automated analysis, take a moment to share how you plan to use the tool in your training and enablement programs. Doing so will give you an opportunity to address any concerns your learners may have, and it will help them understand what you're looking for in an excellent submission. When learners encounter activities that include automated analysis, in-app tooltips explain how their recordings are analyzed and where they can find their results.
Protect Privacy
Some learners may not feel comfortable being recorded or having their submissions analyzed by an algorithm. In such cases, we recommend that trainers provide alternative kinds of practice. Learners can also use Learning's skip feature to opt out of automated analysis exercises.
Understand the Limitations
Be aware that automated analysis may not capture the nuances of accents, regional dialects, or variations in pronunciation.
Review, Learn, and Adapt
As you supplement coaching with automated analysis, monitor how users respond to the insights it provides. Look for opportunities to use such data to improve other parts of your training curriculum. For example, a pattern of incorrect keywords may point to outdated language in older training materials or marketing collateral.
The bottom line: Seismic is committed to the responsible and ethical use of automated analysis as part of a robust and effective training program. As we innovate and update our platform, we will strive to build unbiased and human-centered learning solutions.
Frequently Asked Questions
Q. Can I disable emails containing automated analysis results?
A. It's not possible to disable the results email at this time.
Q. I have suggestions for improving automated analysis. How can I share them with Seismic's product team?
A. Please share your suggestions with your CSM, who will route them to the relevant product team.
Questions? Contact the Support team at support@lessonly.com