
Last month NHS England published PPL’s independent evaluation of virtual wards in the South East of England. The report was the first truly large-scale evaluation of virtual wards anywhere in England, and, though we worked within a set scope, it has sparked rich conversation amongst colleagues and partners on the wider implications of our findings.

We saw this excitement first-hand at NHS Confed this month and had the pleasure of hearing how colleagues across England are using the report to create change. The report has:

  1. Helped change minds – we know from experience that stakeholder buy-in, from clinicians to families, is a necessary condition for successful virtual ward implementation. The strength of evidence in our report has helped tip the scale in terms of bringing people onside in the rollout of virtual wards nationally.
  2. Shared practically useful lessons – our close engagement with ICB programme leads highlighted the immediate need for practically-applicable learning, and our conclusions on the importance of scale, longevity, staffing, choice of operating model, etc. are already informing improvement work across England.
  3. Shared a methodology that uses data available now – our mixed-method analysis combines large-scale correlative analysis and SPC modelling – making the most of the best data we currently have – with extensive qualitative analysis. Colleagues are already adapting this approach in their analyses of step-up care which is difficult to study using standard control-based methodologies with current available datasets.
  4. Made the case for patient and VCSE involvement – partners working to understand patient experience, inequalities in access, the impact on unpaid carers, housing inequality, and digital exclusion have used the report to make the case for their involvement in virtual ward development to ensure these risks are mitigated.
  5. Shifted the conversation towards delivery – our report, in conjunction with positive results from other evaluations, has shifted the conversation away from evaluation towards delivery and continuous improvement that learns from the strongest case studies.

Seeing our work highlighted alongside local evaluations pointed to the growing weight of evidence suggesting virtual wards, when done right, can help realise key outcomes, from improved patient experience to increased hospital productivity, and much besides. The sector’s response to urgent calls for robust evaluative work on virtual wards, particularly around equity, is certainly something to be proud of.

In designing and delivering our evaluation we were in constant conversation with NHSE colleagues, ICB Programme leads in the South East, and our expert advisors. The release of the report to wider audiences has prompted a range of questions, the most frequent of which are set out below with our answers:

Q: Why did you not include X?

Perhaps the most frequently asked questions were to do with omissions, including:

  • Why did you not analyse step-down as well as step-up virtual ward admissions?
  • Why did you not quantify impact on activity and costs beyond UEC?
  • Why did you not analyse every single virtual ward in the South East?

The answer to all these questions is that they fell outside the agreed scope of the evaluation. We absolutely recognise that a large portion of the patient and system benefits of virtual wards were not captured in our report and we are already working with our partners to deliver evaluations that answer these remaining questions.

Q: Why did you choose correlative analysis over regression?

The quantitative element of the evaluation found that “on average 1 NEL admission ‘avoided’ was shown to be correlated with 2.5 virtual ward admissions”; we do not claim to have demonstrated causality purely through quantitative data analysis. Our NHSE clients invested significant resources to standardise aggregate virtual ward admission and financial data to enable a study at this scale, given the lack of linked patient-level data. We understand the limitations of correlative analysis, so we:

  1. Identified narrow population segments for which a given virtual ward looks to reduce admissions by focusing on “the relevant cohort (age, condition, geography) to minimise ‘noise’ when looking for correlations” caused by confounding factors.
  2. Used an SPC hospital admissions forecast that drew on the trust in question’s historical data, accounting for annual cycles, rather than constructing control groups using data from areas that have not rolled out virtual wards. We ran ‘residual sum of squares’ tests for model fit using a range of control groups, but none of these performed better than the SPC model.
  3. Did not include results based on poor-quality data. We set a high bar for completion of age, condition, and geography coding to ensure we could segment populations effectively.
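The model-fit test described above can be sketched in miniature. The figures, helper names, and baselines below are illustrative assumptions, not the report’s actual model or data: we forecast each month as the mean of the same calendar month in prior years (a simple seasonal, SPC-style expectation) and compare its residual sum of squares against a flat “control area” baseline.

```python
# Illustrative sketch (synthetic data): comparing a seasonal SPC-style
# baseline against a control-group baseline using residual sum of squares.
# All figures and helper names here are hypothetical, not from the report.

def seasonal_baseline(history, period=12):
    """Forecast each month as the mean of the same calendar month in the
    historical data, so annual cycles (e.g. winter peaks) are accounted for."""
    n_years = len(history) // period
    return [
        sum(history[m + y * period] for y in range(n_years)) / n_years
        for m in range(period)
    ]

def rss(actual, predicted):
    """Residual sum of squares: lower means a better model fit."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted))

# Three years of monthly NEL admissions with a winter peak (synthetic).
history = [100, 95, 90, 85, 80, 78, 76, 80, 88, 95, 105, 110] * 3
actual = [102, 96, 91, 84, 81, 77, 75, 81, 89, 96, 104, 112]  # evaluation year

spc_forecast = seasonal_baseline(history)

# A "control area" baseline sharing the overall level but not the seasonality.
control_forecast = [sum(history) / len(history)] * 12

print(rss(actual, spc_forecast) < rss(actual, control_forecast))  # True here
```

In this toy setup the seasonal baseline fits far better than the flat control, which mirrors the report’s finding that none of the tested control groups outperformed the SPC model.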

In order to stay in line with the letter and spirit of the Magenta Book, we triangulated our quantitative approach with detailed qualitative analysis to ensure the weight of evidence was sufficient to draw robust conclusions around drivers of impact.

Q: Why did you not present your results in standard statistical format?

To keep the report’s results clear and understandable to the widest possible audience, we set out detailed numerical outputs and assumptions in an appendix and sought to achieve a balance between technical detail and readability in the main body of the report.

Q: Why are you claiming that virtual wards work if not all virtual wards in your evaluation showed a significant positive impact on hospital admissions?

This work was not intended to separately evaluate each virtual ward in the South East. It was intended to evaluate the impact of the South East’s virtual ward offer on hospital admissions in aggregate. The breadth of the current definition of what counts as a ‘virtual ward’ meant we expected large variations in results from the outset, and we do not claim that every virtual ward reduces admissions.

Q: How can virtual wards be associated with an overall net positive financial impact if the majority of individual wards were not associated with a net positive financial impact?

Scale is key here. Larger virtual wards are associated with lower per-patient costs, and therefore with a net positive financial impact; because these wards also account for a disproportionate share of admissions, they have an outsized effect on the South East average, pulling the aggregate positive even where most individual wards are not.
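The arithmetic behind this can be shown with a made-up example. The ward sizes and per-admission impacts below are entirely hypothetical and chosen only to illustrate how an admissions-weighted total can be positive while most wards are individually negative:

```python
# Illustrative arithmetic only (made-up figures): a few large wards can pull
# the aggregate net financial impact positive even when the majority of wards
# individually show a negative one.

wards = [
    # (virtual ward admissions, net financial impact per admission in £)
    (1200,  400),   # large ward: low per-patient cost, net positive
    (900,   250),   # large ward: net positive
    (150,  -300),   # small ward: high per-patient cost, net negative
    (120,  -350),   # small ward: net negative
    (100,  -400),   # small ward: net negative
    (80,   -500),   # small ward: net negative
]

net_total = sum(adms * impact for adms, impact in wards)
negative_wards = sum(1 for _, impact in wards if impact < 0)

print(f"{negative_wards} of {len(wards)} wards are net negative")
print(f"aggregate net impact: £{net_total:,}")
```

Here four of six wards are net negative, yet the two large wards carry enough admissions that the aggregate comes out well ahead.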

Q: If virtual wards are saving us money, where is the money?

Our report deliberately refers to the ‘value of avoided admissions’ or the net financial benefits ‘associated’ with a virtual ward because these savings only become ‘cashable’ if hard decisions are made to close hospital beds. However, reducing length of stay or avoiding non-elective admissions can have productivity benefits for hospitals, enabling greater patient throughput and reducing waiting times for both emergency and elective care for the same level of investment.

Q: Has the rise in ‘corridor care’ in some trusts distorted your findings?

The importance of accounting for corridor care was highlighted to us by South East stakeholders. Where corridor care was present in hospital data, we included it in the overall hospital admission figures so that rises and falls in admissions were fully captured.

Our evaluation looked to extract the most value from the best available evidence in England to answer core questions about the intervention’s efficacy. We know that “patient-level analysis at a regional level would mean more effective controlling for exogenous factors”. We certainly don’t think this is the final word in virtual ward evaluations – we are already working with partners to deliver even more work in this area.

However, we don’t believe in waiting for the perfect dataset before delivering evaluations, particularly as hospitals hit capacity and waiting lists grow, and we believe that “the scale of this evaluation means meaningful conclusions can still be drawn”. We continue to provide operational support for virtual ward programmes, so we understand that evaluations like this are a means to an end – delivering better, more sustainable services to patients.

Our full independent evaluation for NHS England can be read here.