How do I validate and troubleshoot custom workflow scoring?

Verify that scoring and reporting behave as expected and resolve common issues before they affect users or insights.

Written by Lauren Baird
Updated yesterday

Answer

Validate custom workflow scoring by completing test workflows, then reviewing the results in ANVL Insights to confirm that the percentage and the points earned out of possible points match what you expected.

This is important because even small setup issues can create missing scores, incorrect percentages, or misleading reporting. The most common problems come from:

  • missing or incorrect tags

  • non-numeric response values

  • incorrect maxScore values

  • not testing enough scoring scenarios
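Before you run a test scenario, it helps to compute the expected result by hand: add up the numeric response values you plan to select, divide by the sum of the maxScore values, and multiply by 100. The sketch below shows that arithmetic; the question names and point values are illustrative placeholders, not taken from an actual ANVL workflow.

```python
# Sanity-check a custom scoring scenario by hand.
# Question names, response values, and maxScore values below are
# hypothetical examples, not from a real ANVL workflow.

def expected_percentage(responses, max_scores):
    """Return (points earned, possible points, percentage)."""
    earned = sum(responses.values())
    possible = sum(max_scores.values())
    pct = round(100 * earned / possible, 1)
    return earned, possible, pct

# Example: three scored Checklist Radio questions
max_scores = {"ppe_check": 10, "permit_review": 10, "housekeeping": 5}
responses  = {"ppe_check": 10, "permit_review": 5,  "housekeeping": 2}

earned, possible, pct = expected_percentage(responses, max_scores)
print(f"{earned} / {possible} points = {pct}%")  # 17 / 25 points = 68.0%
```

Write the result down as the "Expected Score" for that test case, then compare it against what ANVL Insights reports.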


Steps

Note: Use the testing checklist at the bottom of this article to document your test scenarios and results.

1. Publish the workflow for testing

Publish the workflow to Demonstration / UAT or your approved testing site.

2. Complete test workflows in ANVL Workflows

  1. Open ANVL Workflows.

  2. If you were already signed in before the scoring changes were published, sign out and sign back in to see the latest version.

  3. Open the workflow.

  4. Complete all questions for each test scenario.

At minimum, test:

  • one low-score scenario

  • one expected / good scenario

  • one high-score / best-practice scenario

3. Review the scoring in ANVL Insights

  1. Open ANVL Insights.

  2. Select the Demonstration Group.

  3. Locate the completed test workflows.

  4. Open each completed workflow.

  5. Review:

    • points earned

    • possible points

    • percentage score

  6. Confirm the results match your expectations for each test case.

Validate that the Workflow Score is correct for each test case, and review the individual question scores to verify that each response was scored as intended.

4. Update and test again if needed

  1. If the score does not match expectations, review the scoring setup in Editor.

  2. Check the question design, response values, and scoring tags.

  3. Make the needed updates.

  4. Republish the workflow.

  5. Re-test the same scenarios.

  6. Repeat until the results are correct.

5. Move to live use

  1. Once the test results match expectations, publish the workflow to the live Group(s) for use.

Common scoring issues

  • Scores not appearing
    Check whether the required scoring tags were added correctly.

  • Unexpected scores
    Check whether the response values or maxScore values are misaligned.

  • Scores are all too similar
    Check whether the scoring scale is too narrow to show meaningful variation.

  • Score feels wrong even though math is working
    Check whether the scored questions are actually measuring something useful for the decision you want to support.
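A frequent cause of "unexpected scores" is a maxScore that does not match the highest numeric response option: the percentage then can never reach 100%, even when every best-practice answer is selected. A small sketch of that effect, using hypothetical values:

```python
# Illustrates how a misaligned maxScore caps the percentage.
# All values here are hypothetical; ANVL computes the real score.

def pct(earned, possible):
    return round(100 * earned / possible, 1)

# The question's highest response option is worth 5 points,
# but maxScore was mistakenly set to 10.
best_response = 5
wrong_max = 10
right_max = 5

print(pct(best_response, wrong_max))  # 50.0 -- ceiling capped at 50%
print(pct(best_response, right_max))  # 100.0 -- ceiling correct
```

If your maximum-score test case cannot reach 100%, compare each question's highest response value against its maxScore tag in Editor.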

Helpful notes

  • Validate scoring with more than one scenario.

  • Most scoring issues come from setup, not calculation.

  • It is easier to fix scoring problems before broad rollout than after users are already relying on the results.

Workflow Scoring Test Template

Workflow Name: __________________________
Workflow Type / Program: __________________________
Test Environment: UAT / Demonstration
Tester: __________________________
Date: __________________________


Test Setup Checklist (Before You Begin)

☐ Workflow is published to Demonstration (UAT) group
☐ Tester has signed out and back in to Mobile / Web Workflows to see the latest update
☐ Scored questions use ONLY the Checklist Radio question with numeric responses
☐ maxScore tag is set up correctly on all scored questions


Test Case Planning (Custom Scoring)

At minimum, two test cases are required.
A third is recommended when workflows are complex.

  • Test Case 1: Maximum score (all scored questions selected at highest value)

  • Test Case 2: Middle / mixed score (realistic combination of responses)

  • Test Case 3 (Optional): Minimum score (all scored questions at lowest value)


Test Case 1: Maximum Score

Goal: Confirm the workflow can achieve the expected maximum possible score.

Responses to Use

  • Select the highest numeric value for every scored question

Expected Score (Points / %): __________________

Actual Score (Points / %): __________________

Match? ☐ Yes ☐ No

Notes: __________________


Test Case 2: Middle / Mixed Score

Goal: Confirm scores calculate correctly for a realistic mix of responses.

Responses to Use

  • Combination of high, medium, and low numeric values

  • Reflects typical user behavior

Expected Score (Points / %): __________________

Actual Score (Points / %): __________________

Match? ☐ Yes ☐ No

Notes: __________________


Test Case 3: Minimum Score (Optional)

Goal: Confirm the workflow can score at or near the minimum possible value.

Responses to Use

  • Select the lowest numeric value for all scored questions

Item

Notes

Expected Score (Points / %)

__________________

Actual Score (Points / %)

__________________

Match?

☐ Yes ☐ No

Notes

__________________


Score Review Checklist (After Submission)

For each test case, confirm:

☐ Custom score appears in the workflow summary
☐ Points earned match selected response values
☐ Percentage score reflects points earned / total possible
☐ Maximum score test reaches expected ceiling
☐ Mixed-score test produces a reasonable mid-range result


If Scores Do Not Match Expectations

Review in Editor:

  • maxScore values on scored questions (Checklist Radio questions ONLY)

  • Response options use only numeric values (e.g., 1, 2, 5, 10)

After changes:
☐ Save
☐ Republish
☐ Sign out and back into ANVL Mobile / Workflows
☐ Re-test affected cases


✅ Final Decision

☐ Custom scoring behaves as expected
☐ Ready for production release
☐ Additional changes required

Reviewer Notes:



