Integrating research and system-wide practice in public health: lessons learnt from Better Start Bradford

Publication authors

Josie Dickerson, Philippa K. Bird, Maria Bryant, Nimarta Dharni, Sally Bridges, Kathryn Willan, Sara Ahern, Abigail Dunn, Dea Nielsen, Eleonora P. Uphoff, Tracey Bywater, Claudine Bowyer-Crane, Pinki Sahota, Neil Small, Michaela Howell, Gill Thornton, Kate E. Pickett, Rosemary R. C. McEachan, John Wright

Abstract

Many interventions delivered within public health services have little evidence of effect. Evaluating interventions that are being delivered as part of usual practice offers opportunities to improve the evidence base of public health. However, such evaluation is challenging and requires the integration of research into system-wide practice. The Born in Bradford’s Better Start experimental birth cohort offers an opportunity to efficiently evaluate multiple complex community interventions designed to improve the health, wellbeing and development of children aged 0–3 years. Based on the learning from this programme, this paper offers a pragmatic and practical guide for researchers, public health commissioners and service providers to enable them to integrate research into their everyday practice, thus enabling relevant and robust evaluations within a complex and changing system.

Using the principles of co-production, the key challenges of integrating research and practice were identified, and appropriate strategies to overcome them were developed, across five key stages: 1) Community and stakeholder engagement; 2) Intervention design; 3) Optimising routinely collected data; 4) Monitoring implementation; and 5) Evaluation. As a result of our learning, we have developed comprehensive toolkits (https://borninbradford.nhs.uk/what-we-do/pregnancy-early-years/toolkit/) including: an operational guide to the service design process; an implementation and monitoring guide; and an evaluation framework. The evaluation framework incorporates implementation evaluations, to enable understanding of how interventions perform in practice, and quasi-experimental approaches, to infer causal effects in a timely manner. We also offer strategies to harness routinely collected data to enhance the efficiency and affordability of evaluations that are directly relevant to policy and practice.

These strategies and tools will help researchers, commissioners and service providers to work together to evaluate interventions delivered in real-life settings. More importantly, we hope that they will support the development of a connected system that empowers practitioners and commissioners to embed innovation and improvement into everyday practice, thus enabling them to learn from, evaluate and improve their own services.