So what have we learnt? Time to face up to the challenge of sharing our knowledge
Now that we are a year into the second round of CLAHRC funding, following the nine ‘pilot’ CLAHRCs funded by NIHR and partners from 2008 to 2013, I’ve been asked by people new to CLAHRC what we learnt from the ‘pilots’. And I realised that I wasn’t very clear about this. I knew that there had been some external research evaluations funded by the NIHR; I knew that some CLAHRCs had also carried out their own internal evaluations; and I knew (from experience) that those of us who had received ‘pilot’ funding had reviewed the way we did things, and our achievements, in putting together our bids for further funding, which was awarded in 2013 for a January 2014 start. So I turned to our own CLAHRC team, and specifically to my colleague Roman Kislov, whose PhD was funded through our pilot CLAHRC and who is now a Research Fellow here, and together we tried to find out.
The results of our investigations are now available in a briefing, and from my perspective it’s all quite dispiriting. We started with a focus on ‘research’ about CLAHRCs and, whilst there is nothing wrong with academics writing papers as a result of their studies (indeed, that is what we are supposed to do!), we found that most academic outputs were too late, too theoretical or too partial, focusing on only one aspect. Advancing theory is important, but it seems to be disproportionately represented in the evaluations published to date. We found a lack of research on PPI (patient and public involvement), on the sustainability of change, and on collaboration between the CLAHRCs and with their various partner organisations. Issues identified by the Directors of the pilot CLAHRCs themselves – including maintaining matched funding resources, ensuring that the full range of NHS staff are engaged, and the need to demonstrate academic outputs as well as improvements in care – were also not covered.
Which leads me (as a CLAHRC Director) to the more ‘practical’ side of things. Because the main outputs are academic papers, there is a relative lack of data about the implications for those who are actually ‘doing’ CLAHRC business. Even where academic research might have provided suggestions, the time delays involved in the academic publication process, and in the publication of NIHR-funded project reports, make a nonsense of the notion of formative evaluation. So I might have expected these academic outputs to be supplemented by publicly accessible literature and reports (rather than ‘academic’ papers) with a more pragmatic focus, but this wasn’t the case. In our own CLAHRC, I now realise, we haven’t made available some of our own learning.
I’m in the relatively privileged position of having been involved at a senior level in a pilot CLAHRC and now directing a CLAHRC, and as a result I know a lot of people across CLAHRCs. But not everyone does – including many of the incoming ‘new’ CLAHRC staff in the (now) 13 CLAHRCs. We’ve missed opportunities to share our own learning and to inform the direction of CLAHRCs so far. We now need to face up to the challenge of sharing our knowledge – in ways other than academic papers, if it is to be useful and timely – and this puts the onus on the CLAHRCs themselves to design and conduct rigorous and ‘useful’ local evaluations on issues of common interest. I guess there are also challenges for the commissioning and reporting of external evaluations, but that’s a different topic…
Share your thoughts with Ruth using the comment section below or on Twitter via @RuthBoaden.