

DEVHa: Digitalisation of EValuation for HEE progrAmmes

We deliver HEE research internship, pre-doctoral and post-doctoral programmes for NHS nurses, midwives and allied health professionals. We propose that optimising evaluation through digitalisation will facilitate continuous monitoring and improvement.

  • Proposal
  • 2022

Meet the team


  • Caroline Miller (UHB)
  • Professor Annie Topping (UoB)
  • Dr Ruth Pearce (UHB)

What is the challenge your project is going to address and how does it connect to the theme?

We deliver the Health Education England (HEE) West Midlands research internship, pre-doctoral and post-doctoral programmes for nurses, midwives and allied health professionals in the NHS. We measure success by publications, presentations, grant applications, submitted and successful fellowship applications, career trajectory, and satisfaction. In our 2018 evaluation we used the Kirkpatrick model: participants from the 2014–2017 cohorts (n=82) completed an online survey, and telephone interviews were conducted with participants (n=18) and line managers (n=7). This allowed us to understand positive and negative elements of the programme. Now we want to evaluate and continuously improve simultaneously, by digitalising our evaluation to co-create shared spaces that identify successes and failures in real time. By working with stakeholders and gathering real-time data, we can use quality improvement and implementation science methodology to continuously monitor and embed improvements, so that HEE programme participants can maximise outputs (and impact) and attain enhanced outcomes.

What does your project aim to achieve?

Research supports practitioners to provide the best, most effective care. We want to ensure that the development of research-aspirant clinicians, through the opportunities afforded by HEE programmes, is the best it can be. To improve we must gather routine data (e.g. number of publications), but DEVHa will take this further. By co-creating shared and individual online spaces we will identify nuanced impacts of the HEE programme – changes to patient outcomes, and to ethos/vision in clinical teams. By identifying how our programmes enhance outcomes, we can focus on the mechanisms that underlie success (or failure) and on how success can be spread, scaled up and replicated locally and elsewhere. Clinical academics have the potential to transform the NHS, but the gains from these transformations are often lost because evaluations lack sensitivity and hence cannot identify them. Digitalisation could solve this problem. Our process evaluation will explain whether digitalisation ‘works’ – its usability, inclusivity and NHS compatibility.

How will the project be delivered?

Retrospective data will be analysed for the 8 years that the programme has run, followed by 12 months of prospective data collection.

We will digitalise prospective data collection using:

  • Structured, progressive ePortfolios that evidence and support progression
  • An online platform where stakeholders* (HEE fellows, funders, education providers, line managers, clinical teams) determine whether our HEE programme activities were implemented as intended, and identify expected/unexpected outputs (process evaluation)
  • A digital space with the potential to connect with other regional HEE programme providers; not to share quantitative data, but to identify what good looks like (positive deviance). Incentives (fruit baskets) will be sent to HEE providers who engage with this digital space.

We want this digital platform to encourage a positive way of constantly evaluating and then implementing improvement. ARIMA (Autoregressive Integrated Moving Average) modelling can be used to better understand the data (i.e. to account for seasonal variation). A chi-squared test will compare pre- and post-intervention data.
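As a rough illustration of the comparison step above, the sketch below implements a Pearson chi-squared test on a 2x2 contingency table in plain Python. All counts are hypothetical placeholders, not programme data; in practice the seasonal ARIMA modelling and the test itself would more likely be run with an established statistics package (e.g. statsmodels or scipy) rather than hand-rolled.

```python
def chi_squared_2x2(table):
    """Pearson chi-squared statistic for a 2x2 contingency table.

    table = [[a, b], [c, d]], where rows are pre-/post-intervention
    cohorts and columns are outcome achieved / not achieved.
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Illustrative (invented) counts: fellows who published vs did not,
# before and after digitalisation of the evaluation.
pre = [30, 52]
post = [45, 37]
stat = chi_squared_2x2([pre, post])
# With 1 degree of freedom, the 5% critical value is 3.841.
print(f"chi-squared = {stat:.3f}, significant at 5%: {stat > 3.841}")
```

The same table passed to scipy's `chi2_contingency` (with `correction=False`) would give the matching statistic and a p-value directly.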

How is your project going to share learning?

Through the digital space that we develop, we will share learning across the HEE research capacity-building community from the start, and we have costed incentives to encourage shared learning.

We will use co-creation to work with a minimum of 8 stakeholders* in our region. We have not set out to involve patients, but we have costed for the involvement of up to 4 patients if stakeholders deem it necessary. Patients may then facilitate co-creation of a digital space for HEE teams to share good practice, and for aspirant clinical academics to share data which is often missed – i.e. what is needed for them to increase research activity in their Trust. The stakeholder panel will: i) meet twice yearly; ii) disseminate via Twitter (hashtag #DEVHa); iii) present results of DEVHa via a webinar series for NMAHPs nationally.

We will present the DEVHa project at an appropriate national (or international) conference.

How you can contribute

  • Can Q Community help us to link up with others who have similar ideas?
  • Can Q members contribute methodological expertise as we develop this idea?
  • Can Q members help us to place this work in the wider context of national or regional drivers?

Plan timeline

1 Jul 2022 DEVHa launch; stakeholder group recruitment; retrospective time-series analysis; design of digital spaces (positive deviance/ePortfolios)
1 Sep 2022 Launch the ePortfolios; begin prospective data collection
31 Oct 2022 Work on the shared digital space for positive deviance completed and self-sustaining
31 Aug 2023 Prospective data analysis completed
1 Sep 2023 Write-up and preparation for conference presentation/publication
1 Dec 2023 Project completion


  1. Hi Emma - if you haven't done so already, it may be worth connecting with the Evaluation SIG. I'll post something in the group about this.

    1. Thanks Jo - I have just connected with the SIG - this will be great moving forwards, thanks for letting me know.
