Referral Audits

Key questions/Programme aims
  • What questions will identify if the programme is achieving its aims?

Outcomes/Programme logic
  • What difference do we aim to make?
  • What do we expect to achieve?
  • Who will benefit?

Indicators
  • How will we know if progress is tracking well?
  • What changes will we look for?
  • What indicators will help answer our key questions?

Methods/Data sources
  • What data will we collect and how?

Establish a baseline of referral quality.

Outcomes/Programme logic
  • Develop audits and surveys, and obtain baseline data for care within the clinical streams undergoing evaluation
  • Log all issues being addressed via team/SME negotiation or the CWG process

Indicators
  • Review feedback
  • Pathway-dependent; consider:
    • Referral decline rates
    • First specialist assessment, surgery, and follow-up rates
    • Quality of referral information
    • Reduced testing
    • Wait times
    • Did not attend (DNA) rates
    • Community care rates
  • Feedback from:
    • Key clinical and non-clinical leaders
    • CWGs
    • HP programme team

Methods/Data sources
  • Hospital metrics (pre and post)
  • Referral audits (pre and post); a baseline-summary sketch follows this row
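
The baseline step lends itself to a simple, repeatable summary. Below is a minimal sketch in Python, assuming a hypothetical audit extract named referral_audit_baseline.csv with one row per audited referral and columns pathway, declined (Y/N), and information_complete (Y/N); the file name and column names are illustrative, not part of the programme's defined data set.

```python
# Sketch only: summarise a hypothetical baseline referral-audit extract.
# Assumed columns: pathway, declined (Y/N), information_complete (Y/N).
import csv
from collections import defaultdict

def summarise_baseline(path: str) -> dict:
    """Return per-pathway decline rate and referral-information completeness."""
    totals = defaultdict(lambda: {"n": 0, "declined": 0, "complete": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            t = totals[row["pathway"]]
            t["n"] += 1
            # Booleans add as 0/1, so these lines count "Y" responses.
            t["declined"] += row["declined"].strip().upper() == "Y"
            t["complete"] += row["information_complete"].strip().upper() == "Y"
    return {
        pathway: {
            "audited_referrals": t["n"],
            "decline_rate": t["declined"] / t["n"],
            "information_complete_rate": t["complete"] / t["n"],
        }
        for pathway, t in totals.items()
    }

if __name__ == "__main__":
    for pathway, summary in summarise_baseline("referral_audit_baseline.csv").items():
        print(pathway, summary)
```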

Quantify the benefits of the change:
  • Is the service aligned with best practice?
  • Has referral quality improved?
  • Have costs reduced?
  • Has care in the community increased?

Outcomes/Programme logic
  • Identify improvements in:
    • referral quality
    • e-referrals
    • the wider health system
    • health equity
  • Identify reductions in variation of care

Indicators
  • Review feedback
  • Pathway-dependent; consider:
    • Referral decline rates
    • First specialist assessment, surgery, and follow-up rates
    • Quality of referral information
    • Reduced testing
    • Wait times
    • Did not attend (DNA) rates
    • Community care rates
  • Feedback from:
    • Key clinical and non-clinical leaders
    • CWGs
    • HP programme team

Methods/Data sources
  • Hospital metrics (pre and post)
  • Referral audits (pre and post); a pre-and-post comparison sketch follows this row
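
To quantify change on a single indicator such as the referral decline rate, the pre and post audit counts can be compared directly. The sketch below uses made-up counts and a two-proportion z-test as one plausible way to check that an observed change is larger than chance; the actual indicator definitions and choice of statistical test would follow local analytic advice.

```python
# Sketch only: compare a pre-audit and post-audit decline rate.
# The counts in the example are made up for illustration.
from math import sqrt

def decline_rate_change(pre_declined, pre_total, post_declined, post_total):
    """Return pre/post decline rates, the absolute change, and a z statistic."""
    p_pre = pre_declined / pre_total
    p_post = post_declined / post_total
    pooled = (pre_declined + post_declined) / (pre_total + post_total)
    se = sqrt(pooled * (1 - pooled) * (1 / pre_total + 1 / post_total))
    z = (p_post - p_pre) / se if se else float("nan")
    return {
        "pre_rate": p_pre,
        "post_rate": p_post,
        "absolute_change": p_post - p_pre,
        "z": z,
    }

if __name__ == "__main__":
    # Example: 60 of 400 referrals declined at baseline, 38 of 410 afterwards.
    print(decline_rate_change(60, 400, 38, 410))
```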

Determine which referral education and engagement activities are effective.

Outcomes/Programme logic
  • Increased engagement with HP by health professionals (primary and secondary clinicians)
  • Improved user experience
Indicators
  • User feedback on HP's impact on the quality of consultations
Methods/Data sources
  • Referral audits (pre and post)
  • Feedback
  • Surveys
  • Workgroups

Outcomes/Programme logic
  • Identify effective education and engagement activities
Indicators
  • Number of promotional activities
  • Estimated reach of communications
Methods/Data sources
  • HP programme team
  • Feedback

Outcomes/Programme logic
  • Increased use of HP
Indicators
  • Positive trends in the number of:
    • Page views
    • Users
    • Sessions
Methods/Data sources
  • Google Analytics (see the usage-trend sketch after this row)

Outcomes/Programme logic
  • Determine the effect of education and engagement activities on referral quality
Indicators
  • Referral quality
  • Change in decline rates
Methods/Data sources
  • Referral audits (pre and post)
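
Positive trends in page views, users, and sessions can be checked from a periodic export rather than judged by eye. The sketch below assumes a hypothetical monthly CSV export named hp_usage.csv with columns month, page_views, users, and sessions (illustrative names, not a defined Google Analytics schema), and compares the average of the most recent three months with the three months before.

```python
# Sketch only: flag whether usage metrics are trending upward, using a
# hypothetical monthly export with columns: month, page_views, users, sessions.
import csv

def usage_trends(path: str, window: int = 3) -> dict:
    """Compare the mean of the latest `window` months with the previous `window` months."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))  # assumes at least 2 * window monthly rows
    trends = {}
    for metric in ("page_views", "users", "sessions"):
        values = [int(r[metric]) for r in rows]
        recent = sum(values[-window:]) / window
        previous = sum(values[-2 * window:-window]) / window
        trends[metric] = {
            "previous_avg": previous,
            "recent_avg": recent,
            "positive_trend": recent > previous,
        }
    return trends

if __name__ == "__main__":
    print(usage_trends("hp_usage.csv"))
```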

Has the patient journey or experience improved? Have patient outcomes improved?

Indicators
  • Patient perspectives
  • Pathway-dependent
Methods/Data sources
  • Feedback from patient experience units

Has patient access to specialist care improved? Has the patient journey improved?

Indicators
  • Review feedback
  • Pathway-dependent; consider:
    • Wait times
    • Did not attend (DNA) rates
    • First specialist assessment, surgery, and follow-up rates
Methods/Data sources
  • Feedback from:
    • GPs
    • Specialists
    • Allied health
    • Hospitals
    • SMEs/CEs
    • Clinicians
  • Hospital metrics; a wait-time and DNA-rate sketch follows this row
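
Wait times and DNA rates per pathway can be summarised from routine hospital metrics in the same pre-and-post fashion. The sketch below assumes a hypothetical extract named hospital_metrics.csv with one row per referral and columns pathway, period (pre or post), wait_days, and attended (Y/N); these names are illustrative only.

```python
# Sketch only: summarise median wait time and DNA rate per pathway and period
# from a hypothetical hospital-metrics extract.
import csv
from collections import defaultdict
from statistics import median

def patient_journey_summary(path: str) -> dict:
    """Return median wait (days), DNA rate, and referral count per (pathway, period)."""
    groups = defaultdict(lambda: {"waits": [], "dna": 0, "n": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            g = groups[(row["pathway"], row["period"])]
            g["waits"].append(float(row["wait_days"]))
            g["n"] += 1
            g["dna"] += row["attended"].strip().upper() != "Y"  # count non-attendances
    return {
        key: {
            "median_wait_days": median(g["waits"]),
            "dna_rate": g["dna"] / g["n"],
            "referrals": g["n"],
        }
        for key, g in groups.items()
    }

if __name__ == "__main__":
    for (pathway, period), summary in patient_journey_summary("hospital_metrics.csv").items():
        print(pathway, period, summary)
```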