Implementation Evaluation

This section provides an overview of setting up regular reporting and programme evaluation during the implementation phase of a new HealthPathways programme.

At this stage, your team may be focussed on developing content (localisation). This is important, but so is maintaining your engagement activities and having an evaluation strategy to monitor performance.

This guide is only a suggestion – adapt this framework to meet your needs.

Build evaluation into your programme design. Plan your evaluation before beginning implementation to ensure the programme's expected outcomes are clearly defined and measurable. 

When planning implementation, consider the effort and time frame required for evaluation and allocate adequate resourcing. This will help ensure that the evaluation findings are available when needed to support future decision-making.

Identify programme objectives, key questions, and KPIs

When planning your implementation evaluation, consider any governance requirements, programme objectives, and KPIs. Can your current regular data gathering activities adequately address these? Engage key stakeholders and keep them informed when determining and reviewing the programme outcomes and key questions.

Consider:

  • Are the programme objectives being met?
    • How is the programme tracking towards pathway development goals, e.g. localisations, reviews?
    • Is site activity increasing?
  • Has the implementation of HealthPathways been successful for:
    • The programme team
    • Clinicians
    • The hospital system
    • Patients
    • The wider health system
  • Is health professionals' knowledge and use of, and engagement with, HealthPathways increasing?
  • Are there service redesign, clinical improvement, or localisation opportunities?
Effort required – capacity and capability

Consider how frequently you will evaluate your implementation, and the amount of effort and resourcing required. Does your team have the necessary capacity and capability to complete this work, considering other priorities such as localisation?

The regular reporting and implementation evaluation processes need to be manageable and not arduous, especially since your team will be focussed on localising and generating content in the early stages. Also, it is important not to neglect engagement and outreach activities.

Implementation Evaluation Framework

The implementation evaluation framework has been created by selecting rows from the Basic Evaluation Frameworks. You could also incorporate elements from the Regular Monitoring Framework.

The framework has four columns:

  • Key questions/Programme aims – What questions will identify if the programme is achieving its aims?
  • Outcomes/Programme logic – What difference do we aim to make? What do we expect to achieve? Who will benefit?
  • Indicators – How will we know if progress is tracking well? What changes will we look for? What indicators will help answer our key questions?
  • Methods/Data sources – What data will we collect and how?

Are the HP programme objectives and KPIs being met?

Key questions:
  • How is the programme tracking towards pathway goals (localisations/reviews)?
  • Is the programme using its time efficiently?
  • Is site activity increasing?
  • Review the evaluation frameworks for other relevant questions.

Outcomes/Programme logic: Track programme indicators over time.

Indicators:
  • Number of localisations, reviews, drafts, and partial updates.
  • Number of pathways or pages in development or draft, localisations, and reviews, across the 12 categories: Pathways (clinical), Request pages, Resources, HP platform, Investigations, Medication, Other, Our health system, Patient information, Procedures, Sections, Standing orders (Dot).
  • Time and estimated cost of pathway localisations and reviews – overall cost per pathway and cost per specialty (Dot, HP programme team records).
  • Positive trends in the number of page views, sessions, and users (Google Analytics).

Methods/Data sources: Dot, HP programme team records, Google Analytics.
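The cost indicators above (overall cost per pathway, cost per specialty) can be derived from the team's time records. A minimal sketch follows; the record fields, pathway names, and the blended hourly rate are all illustrative assumptions, not part of Dot or any HealthPathways tooling.

```python
# Hypothetical sketch: estimating localisation cost per pathway and per
# specialty from programme team time records. Field names, pathways, and
# the hourly rate are assumptions for illustration only.
from collections import defaultdict

HOURLY_RATE = 85.0  # assumed blended hourly cost of the programme team

# Example records: hours the team logged against each pathway localisation
records = [
    {"pathway": "Asthma in Adults", "specialty": "Respiratory", "hours": 12.5},
    {"pathway": "COPD", "specialty": "Respiratory", "hours": 9.0},
    {"pathway": "Atrial Fibrillation", "specialty": "Cardiology", "hours": 15.0},
]

def cost_per_pathway(records, rate=HOURLY_RATE):
    """Estimated cost of each pathway localisation (hours x rate)."""
    return {r["pathway"]: r["hours"] * rate for r in records}

def cost_per_specialty(records, rate=HOURLY_RATE):
    """Total estimated localisation cost grouped by specialty."""
    totals = defaultdict(float)
    for r in records:
        totals[r["specialty"]] += r["hours"] * rate
    return dict(totals)

print(cost_per_pathway(records))    # e.g. {'Asthma in Adults': 1062.5, ...}
print(cost_per_specialty(records))  # e.g. {'Respiratory': 1827.5, ...}
```

Tracking these two views side by side helps flag specialties where localisation effort is running well above the average.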

Has the implementation of HP been successful for the programme and clinicians?

Key questions:
  • Are clinicians satisfied with the HP programme?
  • Are there any programme barriers or constraints?
  • Is the programme working as intended?

Methods/Data sources: Review of feedback, HP programme team feedback, focus groups, interviews, surveys.

Is health professionals' knowledge and use of, and engagement with, HP increasing?

Outcomes/Programme logic:
  • Increased participation in and engagement with HP.
  • Update priority plans based on user feedback.
  • Increased user knowledge of and confidence in appropriate care and referral services available locally.
  • Maintain the continued usefulness of HP and its use as a tool for communication.

Indicators:
  • Each quarter, increase since the previous year in the number of users, new users, sessions, and page views (Google Analytics).
  • Improved user experience; user feedback on HP's impact on the quality of consultations (focus groups, interviews, surveys, Dot feedback).
  • Engagement and outreach activities – number of communications and estimated reach (Google Analytics and campaign tracking, feedback).
  • Education activities – number of events and estimated reach (Google Analytics and campaign tracking, feedback).
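The quarterly indicator above compares each quarter with the same quarter a year earlier. A minimal sketch of that comparison, assuming metric totals have been exported from Google Analytics into a simple dictionary (the counts shown are illustrative):

```python
# Hypothetical sketch: comparing a quarter's Google Analytics totals with
# the same quarter the previous year. The metric values are illustrative.
quarterly = {
    # (year, quarter) -> {metric: count}
    (2023, 1): {"users": 1200, "sessions": 4100, "page_views": 9800},
    (2024, 1): {"users": 1500, "sessions": 4800, "page_views": 11300},
}

def yoy_change(quarterly, year, quarter):
    """Percentage change for each metric versus the same quarter last year."""
    current = quarterly[(year, quarter)]
    previous = quarterly[(year - 1, quarter)]
    return {
        metric: round(100 * (current[metric] - previous[metric]) / previous[metric], 1)
        for metric in current
    }

print(yoy_change(quarterly, 2024, 1))
# e.g. {'users': 25.0, 'sessions': 17.1, 'page_views': 15.3}
```

Comparing like-for-like quarters avoids mistaking seasonal swings (e.g. winter respiratory demand) for genuine growth in engagement.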

Are your engagement activities effective?

Key questions:
  • Is your engagement strategy working as intended?
  • How has the engagement strategy achieved this?
  • Which activities are most effective?

Outcomes/Programme logic: Determine the extent and impact of your engagement, education, and outreach activities.

Indicators:
  • Number of communications and events.
  • Estimated reach of communications.
  • Review of feedback.

Methods/Data sources:
  • Google Analytics and campaign tracking.
  • Feedback from clinicians, the HP programme team, and stakeholders.

Are there service redesign, clinical improvement, or localisation opportunities?

Outcomes/Programme logic:
  • Identify and adjust development priorities.
  • Identify service redesign opportunities.
  • Reduce unlocalised page usage.
  • Enlist SME and primary care involvement in pathway localisation.

Indicators:
  • Page views (localised and unlocalised).
  • Search terms.
  • Review of feedback.
  • CWG outcomes.
  • Estimated number of contributors by pathway specialty.

Methods/Data sources:
  • Google Analytics.
  • Dot.
  • HP programme team records.
  • Feedback from CWGs, SMEs/CEs, and clinicians.
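The localised/unlocalised page-view indicator above can directly surface localisation candidates: heavily used pages that have not yet been localised. A minimal sketch, assuming page-view counts have been exported from Google Analytics and tagged with localisation status (page names, counts, and the traffic threshold are illustrative assumptions):

```python
# Hypothetical sketch: flagging heavily used unlocalised pages as
# localisation candidates. Page names and view counts are illustrative.
page_views = [
    {"page": "Gout", "localised": True, "views": 420},
    {"page": "Iron Deficiency", "localised": False, "views": 610},
    {"page": "Insomnia", "localised": False, "views": 95},
]

def unlocalised_share(page_views):
    """Fraction of total page views landing on unlocalised pages."""
    total = sum(p["views"] for p in page_views)
    unloc = sum(p["views"] for p in page_views if not p["localised"])
    return unloc / total

def localisation_candidates(page_views, min_views=100):
    """Unlocalised pages with enough traffic to justify prioritising."""
    return [p["page"] for p in page_views
            if not p["localised"] and p["views"] >= min_views]

print(round(unlocalised_share(page_views), 2))  # e.g. 0.63
print(localisation_candidates(page_views))      # e.g. ['Iron Deficiency']
```

A falling unlocalised share over time is one simple signal that development priorities are following actual demand.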