
How do workflows support reporting and insights?

Understand how well-designed workflows make reporting easier, more reliable, and more useful for your program.

Written by Lauren Baird
Updated this week

Answer

Workflows support reporting and insights through the way they are designed: reliable reporting depends on the workflow collecting data in a consistent, structured way. In ANVL, reporting quality is driven mainly by four things:

  • question design

  • workflow template tags

  • question-level tags

  • scoring, when used

Good reports do not start in the dashboard. They start in the workflow.


Steps

Design questions for consistent reporting

  1. Write similar questions the same way across similar workflows.

  2. Use the same question type when the same data should be compared.

  3. Prefer structured responses over free-form text when you need reliable reporting.

  4. Avoid collecting the same kind of data in different ways across sites or templates.
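The value of structured responses shows up quickly when you try to count answers. Here is a minimal sketch in Python, using made-up response data (none of these values come from ANVL): structured choices aggregate directly, while free-form text needs fragile normalization before it can be counted at all.

```python
from collections import Counter

# Hypothetical responses to the same inspection question, collected two ways.
structured = ["pass", "fail", "pass", "pass"]            # fixed answer choices
free_form = ["Looks OK", "failed - leak", "ok", "Pass"]  # free text

# Structured data reports directly:
print(Counter(structured))  # Counter({'pass': 3, 'fail': 1})

# Free text needs guesswork before it can be counted:
normalized = ["pass" if "ok" in t.lower() or "pass" in t.lower() else "fail"
              for t in free_form]
print(Counter(normalized))  # Counter({'pass': 3, 'fail': 1})
```

The structured version needs no interpretation; the free-form version only works as long as the normalization rules keep up with how people actually type.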

Use workflow template tags consistently

  1. Apply the workflow template reportName to all workflows so the workflow itself is labeled correctly in reporting.

  2. Use consistent workflow template tags such as reportName, workflowType, and language tags where applicable.

  3. Review workflow template tags carefully before publishing, especially when workflows are used across multiple sites or programs.
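A pre-publish review of template tags can be as simple as checking that every template carries a reportName. This is a hypothetical sketch, assuming tags are represented as plain dictionaries (an illustrative data model, not an ANVL API):

```python
# Hypothetical tag sets for three workflow templates (illustrative data only).
templates = {
    "Line 1 Safety Audit": {"reportName": "Safety Audit", "workflowType": "audit"},
    "Line 2 Safety Audit": {"reportName": "Safety Audit", "workflowType": "audit"},
    "Line 3 Safety Audit": {"workflowType": "audit"},  # reportName missing
}

# Flag templates that would be unlabeled in reporting.
missing = [name for name, tags in templates.items() if "reportName" not in tags]
print(missing)  # ['Line 3 Safety Audit']
```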

Use question-level tags consistently

  1. Apply a question-level reportName to every question you expect to appear in reports.

  2. Use the same question-level reportName across workflow templates when the same question should roll up together in reporting.

  3. Review question-level tags carefully before publishing, especially when workflows are used across multiple sites or programs.

  4. Only questions with a correctly configured question-level reportName appear in the Workflow Drilldown report.


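To see why a consistent question-level reportName matters, consider a sketch that groups exported responses by that tag. The record format here is hypothetical (invented for illustration, not ANVL's export schema): the reportName is the only thing tying the same question together across templates.

```python
from collections import defaultdict

# Hypothetical exported responses from two different workflow templates.
responses = [
    {"template": "Line 1 Audit", "reportName": "PPE Worn", "answer": "yes"},
    {"template": "Line 2 Audit", "reportName": "PPE Worn", "answer": "no"},
    {"template": "Line 1 Audit", "reportName": "Area Clean", "answer": "yes"},
]

# Roll up answers by question-level reportName, across templates.
rollup = defaultdict(list)
for r in responses:
    rollup[r["reportName"]].append(r["answer"])

print(dict(rollup))  # {'PPE Worn': ['yes', 'no'], 'Area Clean': ['yes']}
```

If the two templates used different reportName values for the same question, the answers would land in separate buckets and never roll up together.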

Use scoring intentionally

  1. Use Custom Scoring when you need comparison, prioritization, or weighted results.

  2. Use Strength Score when you want to measure completion quality or effort.

  3. Make sure any scoring supports a real decision and fits the purpose of the workflow.
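Weighted custom scoring can be sketched in a few lines. The questions and weights below are purely illustrative, not ANVL's scoring model: each question earns its weight when answered acceptably, and the score is the earned share of the total.

```python
# Hypothetical weighted scoring for a Layered Process Audit question set.
answers = {"PPE Worn": True, "Area Clean": False, "Guards in Place": True}
weights = {"PPE Worn": 5, "Area Clean": 2, "Guards in Place": 3}

earned = sum(weights[q] for q, ok in answers.items() if ok)   # 5 + 3 = 8
possible = sum(weights.values())                              # 10
score = round(100 * earned / possible)
print(f"{score}%")  # 80%
```

Weighting lets a single high-priority miss move the score more than several low-priority ones, which is the point of using Custom Scoring for comparison and prioritization.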


Example of Custom Scoring of a Layered Process Audit.

Review reporting readiness before publishing

  1. Confirm the questions are designed consistently.

  2. Confirm the workflow template reportName and other template tags are applied consistently.

  3. Confirm the question-level reportName and other question tags are applied consistently.

  4. Confirm any scoring supports the intended reporting outcome.

  5. Test the workflow and review the results before broader rollout.

  6. Keep the distinction clear:

    1. workflow template reportName = the workflow

    2. question-level reportName = the question

