What are AI Audits?
AI Audits use artificial intelligence to assess the quality of case notes against your organisation's documentation standards. Each note receives a quality score and constructive feedback, helping case workers improve their record-keeping and ensuring consistent standards across your team.
Why use AI Audits?
Consistent quality standards — Every note is assessed against the same criteria
Time-saving — Automated assessment frees supervisors from reviewing every note manually
Immediate feedback — Case workers get actionable suggestions for improvement
Performance insights — Track documentation quality trends across your team
Training support — Identify common gaps to address in team training
Accountability — Demonstrate that notes are regularly quality-checked
What Audits Measure
AI Audits assess documentation quality — whether case notes are complete, professional, and person-centred. The assessment considers questions such as:
Is the date and time of interaction recorded?
Is it clear what happened and what was discussed?
Are actions and next steps documented?
Is the tone respectful and person-centred?
Are any risks or concerns noted appropriately?
Is there enough detail for a colleague to understand the situation?
Important: Audits measure how well the note is written, not whether the case itself is progressing well. A case worker could have excellent audit scores but the person they're supporting could still be struggling. For tracking outcomes, see the Theory of Change and Insights guide.
Setting Up Audit Criteria
Before auditing notes, you should define what good documentation looks like for your organisation.
Configuring Your Criteria
Go to Settings in the main navigation
Select Case Management
Find the Audit Criteria section
Enter your documentation standards in the text box
Click Save
What to Include
Your audit criteria should describe what you expect in a well-written case note. For example:
Good case notes should include:
Date and time of the interaction
Who was present and how contact was made (phone, in person, etc.)
What was discussed and any issues raised
The person's current situation and any changes since last contact
Actions taken during or after the interaction
Clear next steps with timeframes where possible
Any risks or concerns, even if minor
Evidence of person-centred language and approach
The AI uses these criteria when scoring notes, so be specific about what matters to your organisation.
Tips for Effective Criteria
Be specific — "Include relevant details" is vague; "Record what actions were taken" is clear
Prioritise what matters — Focus on the elements that make a real difference to quality
Keep it reasonable — Not every note needs to be a masterpiece; set achievable standards
Review periodically — Update criteria as your expectations evolve
How Auditing Works
Audit Scores
Each audited note receives a score from 0 to 100:
80-100 — Excellent documentation meeting all standards
60-79 — Good documentation with minor gaps
40-59 — Adequate but missing some important elements
Below 40 — Significant gaps requiring attention
The AI also provides written feedback explaining the score and suggesting specific improvements.
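The score bands above can be expressed as a simple lookup. This is an illustrative sketch only — `score_band` is a hypothetical helper, not part of the product; the thresholds are taken directly from the bands listed above:

```python
# Hypothetical helper illustrating the audit score bands from this guide.
# The function name is illustrative; only the thresholds come from the document.

def score_band(score: int) -> str:
    """Map a 0-100 audit score to its quality band."""
    if not 0 <= score <= 100:
        raise ValueError("Audit scores range from 0 to 100")
    if score >= 80:
        return "Excellent documentation meeting all standards"
    if score >= 60:
        return "Good documentation with minor gaps"
    if score >= 40:
        return "Adequate but missing some important elements"
    return "Significant gaps requiring attention"
```

For example, a note scoring 72 falls in the "Good documentation with minor gaps" band.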
When Audits Happen
Audits do not happen automatically when notes are created. Instead:
Supervisors can audit individual notes manually
Bulk audits can assess multiple notes at once
Notes that haven't been audited are flagged for attention
This gives you control over when and how auditing happens.
Using the Audits Tab
Viewing Audit Results
Go to Cases in the main navigation
Click the Audits tab
Use the filters to narrow down by:
Case worker — See audits for a specific team member
Date range — Focus on a particular time period
You'll see a table of audited notes showing:
The note content (preview)
Case worker name
Audit score
Date of the note
AI feedback (click to expand)
Finding Unaudited Notes
The Audits tab also shows notes that haven't been audited yet:
Select a case worker (or "All case workers") from the filter
Scroll down to the Unaudited Case Notes section
You'll see notes from the selected date range that need auditing
Auditing Individual Notes
To audit a single note:
Find the note in the unaudited section
Click the Audit button next to it
Wait a few seconds for the AI to process
The note moves to the audited list with its score and feedback
Bulk Auditing
To audit multiple notes at once:
From the Audits tab, select "All case workers" or a specific case worker
In the Unaudited Case Notes section, you'll see all pending notes
Click Audit all to start the bulk audit
A progress indicator shows how many notes have been processed
Once complete, you'll see a summary of results
Bulk auditing is useful for:
Regular quality checks (e.g., monthly audit of all notes)
Catching up on a backlog of unaudited notes
Comparing quality across the team
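The bulk audit flow described above — notes processed one at a time, with a running progress indicator — can be sketched as follows. All names here (`bulk_audit`, `audit_note`, `on_progress`) are hypothetical stand-ins for illustration, not the product's actual API:

```python
# Illustrative sketch of a sequential bulk audit with progress reporting.
# `audit_note` stands in for the AI assessment call; internals are assumed.

def bulk_audit(notes, audit_note, on_progress=None):
    """Audit each note in sequence and return a summary of results."""
    scores = []
    total = len(notes)
    for done, note in enumerate(notes, start=1):
        scores.append(audit_note(note))  # one AI assessment per note
        if on_progress:
            on_progress(done, total)     # e.g. show "3 of 20 processed"
    average = sum(scores) / total if total else 0
    return {"audited": total, "average": average}
```

Processing notes in sequence rather than all at once is what makes the progress indicator meaningful, and it is why large batches can take a few minutes.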
Re-auditing Notes
If audit criteria have changed or you want a fresh assessment:
Find the audited note in the results table
Click the Re-audit button
The note is re-assessed against current criteria
The new score and feedback replace the previous ones
Supervisor Overrides
Sometimes you may disagree with an AI audit score. Supervisors can override it:
When to Override
The AI missed important context
The score doesn't reflect the note's true quality
You want to provide your own feedback
How to Override
Find the audited note
Click to view the full audit details
Click Override score
Enter the corrected score (0-100)
Add your feedback explaining the override
Click Save
Overridden audits show both the original AI score and your corrected score, maintaining transparency.
Performance Analysis
Team Performance View
The Audits tab can generate a performance analysis for any case worker:
Select a case worker from the filter
Click Analyse performance
You'll see:
Average audit score
Number of notes audited
Breakdown of high/low scores
AI-generated analysis of patterns
Specific recommendations for improvement
This is useful for:
Supervision meetings
Identifying training needs
Recognising good performance
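The headline metrics above (average score, number of notes audited, high/low breakdown) can be sketched like this. `performance_summary` is a hypothetical illustration, not the product's implementation; the 80 and 40 cut-offs are borrowed from the score bands earlier in this guide:

```python
# Illustrative summary of one case worker's audit scores.
# High/low thresholds (80/40) are taken from the guide's score bands.

def performance_summary(scores, high=80, low=40):
    """Summarise a list of 0-100 audit scores for a performance view."""
    if not scores:
        return {"count": 0, "average": None, "high": 0, "low": 0}
    return {
        "count": len(scores),
        "average": round(sum(scores) / len(scores), 1),
        "high": sum(1 for s in scores if s >= high),   # notes at 80 or above
        "low": sum(1 for s in scores if s < low),      # notes below 40
    }
```

Note that the AI-generated analysis of patterns and recommendations goes beyond these numbers; this sketch covers only the arithmetic part of the view.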
Tracking Trends
By running audits regularly and reviewing scores over time, you can:
See whether documentation quality is improving
Identify case workers who may need support
Demonstrate quality assurance to funders and regulators
Best Practices
Audit regularly
Set a routine (e.g., monthly) for auditing recent notes. This prevents backlogs and gives timely feedback.
Share feedback constructively
Use audit results as a development tool, not punishment. Focus on improvement and recognise progress.
Keep criteria up to date
Review your audit criteria periodically. As your service evolves, your documentation expectations may change.
Don't chase perfection
A score of 75 might be perfectly acceptable for a brief check-in call. Context matters — not every note needs to score 95.
Use insights alongside audits
Remember that audits measure documentation quality. Use the Insights feature alongside audits to track whether cases are actually progressing well.
Frequently Asked Questions
Are audits automatic? No. Audits are triggered manually, either one at a time or in bulk. Notes don't automatically get audited when created.
Can case workers see their audit scores? Yes, if they have access to the Audits tab. This transparency helps them learn and improve.
What if I haven't set up audit criteria? The AI will still provide scores based on general good practice for case notes, but results will be more useful if you've defined your specific expectations.
How long does an audit take? Individual audits take a few seconds. Bulk audits process notes in sequence and may take a few minutes for large batches.
Can I export audit data? Yes, audit results can be included in reports for supervision, quality assurance, and funder reporting.
Does auditing affect the case or the note? No. Auditing is purely assessment — it doesn't change anything about the case or the original note.
