Pathway Evaluation – Clinical Improvement, Audit, or Service Redesign

Following the launch of a new pathway, case-by-case audits of patient cohorts can help assess whether outcomes are as expected.

This is meticulous work; ideally your programme collects metrics automatically. In practice, the data often sits in multiple systems or on paper and requires manual extraction.

This sample evaluation could also be expanded to track a suite of clinical pathways.

Key questions or programme aims

  1. Does the pathway conform with current best clinical practice?
  2. Has the wider health system been consulted, and feedback incorporated?
  3. Were a clinical evaluation and impact assessment conducted before the change was implemented?
  4. Establish a data baseline.
  5. What improvements or outcomes has the service redesign or clinical improvement made?
  6. Quantify the benefits of the change.
  7. Has the patient journey improved?
  8. Review the outcomes and system redesign process.

Indicators

  • Audits of patient flow and outcomes (pre and post)
  • Referral audit
  • Review feedback
  • Pathway-dependent 

Methods

  • Patient flow using a hypothetical clinical pathway 
  • Referral audit
  • Clinician surveys
  • Pathway-dependent

Publication database examples

Each key question below is paired with its outcomes, indicators, and methods/data sources, following this four-column template:

  • Key questions/Programme aims – What questions will identify if the programme is achieving its aims?
  • Outcomes/Programme logic – What difference do we aim to make? What do we expect to achieve? Who will benefit?
  • Indicators – How will we know if progress is tracking well? What changes will we look for? What indicators will help answer our key questions?
  • Methods/Data sources – What data will we collect and how?

Does the pathway conform with current best clinical practice?

  • Latest guidelines
  • National/state health priorities
  • Locally accepted best practice
  • Identify if the pathway is clinically accurate and up to date:
    • Reflects best practice and local agreement
    • Contains accurate local referral information
  • Log all issues being addressed via HP team/SME negotiation or the CWG process 
  • Identify barriers
  • Identify possible improvements
  • Pathway-dependent 
  • Audit selected pathways for:
    • Clinical quality
    • Referral accuracy
  • Review feedback
  • Pathway audits:
    • Clinical audits
    • Referral options
  • Feedback:
    • CWGs
    • SMEs/CEs
    • Clinicians
    • HP programme team

Has the wider health system been consulted, and feedback incorporated?

  • Resolve all identified issues or report them to the governance group, stakeholders, and/or CWGs
  • Review feedback:
    • CWGs
    • SMEs/CEs
    • Clinicians
    • Allied health
  • Interviews with key clinical and non-clinical leaders
  • Interviews
  • Surveys
  • Patient perspectives
  • Feedback from patient experience units

Were a clinical evaluation and impact assessment conducted before implementing the change?

  • Understand patient flow and the consequences of redesigns
  • Review hypothetical patient outcomes (pre and post)
  • Identify and rectify or mitigate any adverse outcomes before implementation
  • Understand change impact on:
    • Patient flow
    • Patient journey
    • The wider health system
  • Pathway-dependent
  • Consider:      
    • Referral decline rates
    • Quality of referral information
    • First specialist assessment, surgery, and follow-up rates
    • Wait times
    • Did not attend (DNA) rates
  • Pseudo pathway – Analyse patient flow using a hypothetical pathway 
  • Referral audit
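The pseudo-pathway step above — replaying past referrals through the proposed rules before go-live — can be sketched in a few lines. This is a minimal illustration: the triage rule and record fields are invented for the example, not taken from any real pathway.

```python
# Replay historical referrals through a proposed triage rule to estimate,
# before implementation, how volumes would shift. The rule and the record
# fields ("urgent", "gp_managed") are hypothetical.
historical = [
    {"urgent": True,  "gp_managed": False},
    {"urgent": False, "gp_managed": True},
    {"urgent": False, "gp_managed": False},
    {"urgent": False, "gp_managed": True},
]

def proposed_triage(referral):
    """Proposed rule: non-urgent referrals manageable in primary care are
    returned to the GP with advice instead of a specialist appointment."""
    if not referral["urgent"] and referral["gp_managed"]:
        return "return_with_advice"
    return "specialist_assessment"

def flow_summary(records, triage):
    """Count how many referrals each triage outcome would receive."""
    counts = {}
    for record in records:
        outcome = triage(record)
        counts[outcome] = counts.get(outcome, 0) + 1
    return counts

print(flow_summary(historical, proposed_triage))
# → {'specialist_assessment': 2, 'return_with_advice': 2}
```

Comparing this summary with the current pathway's actual outcomes gives an estimate of the redesign's impact on patient flow before any patient is affected.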

Establish a data baseline

  • Develop audits and surveys, and obtain baseline data for care within the clinical streams undergoing evaluation
  • Log all issues being addressed via HP team/SME negotiation or the CWG process
  • Referral quality 
  • Review feedback
  • Pathway-dependent 
  • Consider:
    • Referral decline rates
    • First specialist assessment, surgery, and follow-up rates
    • Quality of referral information
    • Reduced testing
    • Wait times
    • Did not attend (DNA) rates
    • Community care rates
  • Feedback:
    • Key clinical and non-clinical leaders
    • CWGs
    • HP programme team
  • Hospital metrics (pre and post)
  • Referral audit (pre and post)
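Baseline metrics like those above are mostly simple counting over a referral extract. A minimal sketch, assuming hypothetical record fields (`declined`, `attended`, `wait_days`) rather than any particular system's schema:

```python
from statistics import mean

# Hypothetical referral records extracted for the baseline period; the
# field names are illustrative only.
referrals = [
    {"declined": False, "attended": True,  "wait_days": 30},
    {"declined": False, "attended": False, "wait_days": 45},
    {"declined": True,  "attended": False, "wait_days": 0},
    {"declined": False, "attended": True,  "wait_days": 21},
]

def baseline(records):
    """Summarise decline rate, DNA rate, and mean wait for accepted referrals."""
    accepted = [r for r in records if not r["declined"]]
    return {
        "decline_rate": sum(r["declined"] for r in records) / len(records),
        # DNA (did not attend) rate is counted among accepted referrals only.
        "dna_rate": sum(not r["attended"] for r in accepted) / len(accepted),
        "mean_wait_days": mean(r["wait_days"] for r in accepted),
    }

print(baseline(referrals))
```

Running the same summary on post-change extracts gives directly comparable pre and post figures.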

What improvements or outcomes has the service redesign or clinical improvement made?

  • Improved:
    • Patient outcomes 
    • Patient flow 
    • Health equity 
    • Clinical relationships (primary, secondary, tertiary)
  • Reduced:
    • Costs 
    • Variations of care
  • Pathway-dependent
  • Consider:
    • Referral quality
    • Patient experience
    • Resources used, e.g. testing
    • Community care rates
  • Feedback:
    • Allied health
    • Hospitals
    • SMEs/CEs
    • Clinicians
  • Health-system metrics:
    • Hospitals
    • NGOs
    • Allied health
  • Referral audits

Quantify the benefits of the change

  • Is the service aligned with best practice?
  • Has referral quality improved?
  • Have costs reduced?
  • Has care in the community increased?
  • Identify improvements in:
    • Referral quality 
    • E-referrals 
    • The wider health system 
    • Health equity
  • Identify reduced variations of care
  • Review feedback
  • Pathway-dependent 
  • Consider:
    • Referral decline rates
    • First specialist assessment, surgery, and follow-up rates
    • Quality of referral information
    • Reduced testing
    • Wait times
    • Did not attend (DNA) rates
    • Community care rates
  • Feedback:
    • Key clinical and non-clinical leaders
    • CWGs
    • HP programme team
  • Hospital metrics (pre and post)
  • Referral audit (pre and post)
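One way to quantify a pre/post change in a rate such as the DNA or referral decline rate is the absolute change in percentage points plus a two-proportion z-test. This is a sketch under stated assumptions: the `rate_change` helper is invented for illustration, the counts are not real data, and the normal approximation needs reasonably large cohorts.

```python
from math import erf, sqrt

def rate_change(events_pre, n_pre, events_post, n_post):
    """Absolute change in a rate, plus a two-sided two-proportion z-test
    p-value (normal approximation; assumes reasonably large counts)."""
    p1, p2 = events_pre / n_pre, events_post / n_post
    pooled = (events_pre + events_post) / (n_pre + n_post)
    se = sqrt(pooled * (1 - pooled) * (1 / n_pre + 1 / n_post))
    z = (p2 - p1) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided tail
    return p2 - p1, p_value

# Illustrative counts only: DNA events before and after the pathway change.
change, p = rate_change(events_pre=120, n_pre=1000, events_post=80, n_post=1000)
print(f"DNA rate change: {change:+.1%} (p = {p:.3f})")
```

Reporting the percentage-point change alongside a p-value keeps the before/after comparison honest about sample size: small cohorts can show large swings that are not statistically meaningful.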

Has the patient journey improved?

  • Has patient access to specialist care improved?
  • Have patient outcomes improved?
  • Patient perspectives
  • Patient outcomes
  • Pathway-dependent
  • Feedback from patient experience units
  • Qualitative feedback from clinicians:
    • Surveys
    • Interviews
  • Consider:
    • Wait times
    • Did not attend (DNA) rates
    • First specialist assessment, surgery, and follow-up rates
  • Hospital metrics (pre and post)

Review the outcomes and system redesign process

  • Were the desired or expected outcomes achieved?
  • What lessons were learned during the process? Can they be shared?
  • Was this the correct approach?
  • Would you change anything if you had to do it again?
  • Patient perspectives
  • Feedback from patient experience units
  • Impacts and outcomes
  • Differences between the actual and pseudo pathway
  • Pathway-dependent
  • Interviews with key clinical and non-clinical leaders
  • Qualitative feedback from clinicians:
    • Surveys
    • Interviews
  • Feedback:
    • Allied health
    • Hospitals
    • SMEs/CEs
    • Clinicians