The Assessment gives learners instant feedback, in the form of a report, on the skills and behaviors you want them to exhibit and develop in your simulations.
To create this report, the platform reviews the transcript of the learner’s conversation with the AI Character and then evaluates the learner’s performance based on the criteria listed in the Assessment section.
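If it helps to picture the mechanics, below is a minimal conceptual sketch of that flow in Python. It is not the platform's actual implementation: the transcript, the criteria, and the `evaluate_with_llm` function are all hypothetical placeholders.

```python
# Conceptual sketch only -- NOT the platform's implementation.
# It illustrates the idea described above: the conversation transcript and
# each criterion's description are passed to an LLM, which returns an
# integer score and written feedback per criterion.

transcript = (
    "Learner: I understand the delay is frustrating, and I'm sorry about that.\n"
    "AI Character: Thank you. What can you do to fix it?\n"
    "Learner: Let me walk you through two options we can start today."
)

# Hypothetical criteria: a short title plus scoring instructions for the LLM.
criteria = {
    "Empathy": "Score 1-5 based on whether the learner acknowledges the "
               "AI Character's feelings before proposing solutions.",
    "Clarity": "Score 1-5 based on whether the learner lays out next steps "
               "in plain, concrete language.",
}

def evaluate_with_llm(prompt: str) -> dict:
    """Hypothetical placeholder for a call to a large language model."""
    return {"score": 4, "feedback": "Acknowledged frustration early; could probe further."}

results = {}
for title, description in criteria.items():
    prompt = (
        "Evaluate the learner in the transcript below against this criterion.\n"
        f"Criterion: {title}\n"
        f"Instructions: {description}\n"
        "Return an integer score and short written feedback.\n\n"
        f"Transcript:\n{transcript}"
    )
    results[title] = evaluate_with_llm(prompt)

print(results)
```

In practice you never write code like this; you only supply the criterion titles and descriptions (see the CRITERIA tab below), and the platform handles the evaluation automatically.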
The report includes the following components:
Report title
Indicative score: the average of the integer scores across all criteria (see the worked example after this list)
Graph: a plot of the integer score for each criterion
Summary: a high-level overview of the learner's strengths and areas for improvement, given their performance on each criterion
Feedback + integer score per criterion: the strengths and areas for improvement, taking into account the learner's performance in the simulation and the defined criteria
Recommendations: a list of the top areas for improvement for a future simulation attempt
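For example, suppose the report uses a 1-to-5 scale and the learner receives integer scores of 4, 3, and 5 on three criteria. The indicative score would then be 4, the average of those three scores.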
Example of a generated feedback report.
In the platform, the feedback report lives under the ASSESSMENT section. The feature can be enabled or disabled by selecting the checkbox to the left of “Enable assessment” (see image below).
Enable and disable feedback report.
In the ASSESSMENT section, there are three tabs: General, Criteria, and Preview.
In the GENERAL tab, you can choose whether to display the feedback report to learners, give the report a title and description, and set a scale for the integer scores (e.g., 1 to 5 or 0 to 100). The Evaluation title, description, and max score will be displayed to learners in the PDF version of the report.
In the CRITERIA tab, you can add up to 6 criteria (via the “Add Criteria” button; see image below). For each criterion, you will specify a:
Title: the name of the criterion, which will be displayed to learners in the report
Description: detailed instructions for the Large Language Model (LLM) on how to evaluate the conversation with the AI Character; the LLM uses these instructions to produce the integer score and personalized feedback. For additional guidance on how to craft evaluation criteria, check out this article and the illustrative example below this list.
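For instance (purely illustrative, not a template from the platform), a criterion might pair the title “Active listening” with a description such as: “Evaluate whether the learner paraphrases the AI Character’s concerns before responding, asks clarifying questions, and confirms understanding before offering solutions. Award higher scores when the learner does this consistently throughout the conversation.”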
Adding criteria to the feedback report.
If you are working from a template with pre-existing criteria, you can edit the criteria titles and descriptions as you see fit (see image below).
Edit criteria for the evaluation report.
As you refine the criteria, it is important to test the feedback reports generated after a conversation to ensure that the criterion descriptions produce the desired output. Feedback report testing can be done individually, with other subject matter experts or collaborators, and with learners. To learn more about the process of honing criteria, we suggest reviewing Testing and iterating on feedback reports.