Randomised controlled trials can provide strong evidence of whether complex social and psychological interventions are effective. Most of the programmes listed on Investing in Children have evidence of impact from RCTs. But inadequate information in reports on RCTs makes it harder to trust the results. An international initiative is seeking to change this by developing new reporting guidelines.
A key advantage of RCT findings – where similar trial participants have been randomly allocated to programme or control groups – is that if significant differences emerge between the two groups, they can reasonably be attributed to the intervention. But drawing that conclusion may depend on how well the trial has been conducted. To interpret the findings, accurate and transparent information is needed about intervention designs, the way they are delivered and the level of uptake.
There is also a need for policy makers, service commissioners and practitioners to understand what use they can make of the findings. To do that they need information about the social and environmental context in which the programme was applied, as well as details about the way it was implemented. Without that knowledge, there is a danger that the resources invested in often-expensive RCTs will be wasted.
Researchers in the UK and Canada planning to develop the new reporting guideline argue that examples of poorly reported RCTs can be found across disciplines such as criminology, social work, education and psychology. Prevention and treatment programmes in these fields often have multiple strands, packaging different kinds of intervention while aiming to improve more than one outcome.
An example of this complexity would be Multisystemic Therapy (MST), a respected programme for young people with chronic behaviour problems that targets individual, peer, family, school and neighbourhood influences. Therapists and case workers provide tailored services not only for the individual young person, but also for other family members. These may be provided in the home, at school, in social care facilities or in the community.
In medicine, where RCTs are accepted as ‘gold standard’ evidence, researchers have developed guidelines for reporting health-related interventions. The Consolidated Standards of Reporting Trials (CONSORT) Statement 2010 is credited with improving the way thousands of medical experiments have been reported and is applied by more than 600 academic journals worldwide.
However, efforts to add “bolt-on” elements to CONSORT requirements in ways that are relevant to social and psychological interventions (SPI) have not gained widespread acceptance. University of London researcher Evan Mayo-Wilson and colleagues in Oxford, Belfast and Ottawa argue that what is now needed is CONSORT-SPI, a purpose-designed extension to the guidelines.
They argue that the extension needs to focus especially on issues concerned with “internal validity” – the extent to which results may be influenced by bias. Information about randomisation procedures is often lacking, and studies frequently fail to make clear whether those assessing outcomes were “blind” to whether data came from intervention or control group participants. Information on the validity and reliability of measures used to assess subjective outcomes, such as changes in behaviour, is also needed, especially if the measures are not already well known.
The researchers also suggest that existing CONSORT guidelines do not pay enough attention to “external validity”: the extent to which study results apply in other settings or populations. To be truly useful to policy makers and practitioners, RCT reports should describe the key components of interventions, how they should be delivered, and a “theory of change” concerning their relationship to the outcomes being targeted.
It is also important that reports say how the planned interventions were actually implemented and received by participants – which may not always match the original design.
A further area of weakness new guidelines could remedy is the lack of information about how trial participants were recruited – not least because this may have been different from the way that service users normally access programmes. Changes introduced by the researchers may, consequently, affect the characteristics of participants.
Other areas where new guidelines could clarify the practical implications of a study include information about the nature and level of administrative support during a trial, such as staff training and supervision. Providing contextual information about service systems and relevant political or social circumstances would also be helpful.
This is especially important since an intervention that works for one group of people may not work for people living in different cultures or physical spaces, or for people with slightly different problems. Decision makers need help to decide whether a programme evaluated somewhere else at another time is appropriate in the here and now.
The new reporting guidance is being developed as an official extension of the CONSORT Statement. Stakeholders with expertise from all related disciplines and from all regions of the world will be consulted and a consensus sought. In addition to a checklist and flowchart, the initiative will produce a document that explains the scientific rationale for each recommendation and provides examples of clear reporting. Those wishing to be involved can sign up at http://tinyurl.com/CONSORT-study
RCTs are not, as the CONSORT-SPI development team acknowledge, the only valid method for evaluating preventive and other psychosocial interventions. But they are also not the only type of study that would benefit from better reporting. Success in developing new guidelines for trials will hold implications for the way that other research findings are reported as well.
Mayo-Wilson, E., Grant, S., Hopewell, S., Macdonald, G., Moher, D. & Montgomery, P. (2013) Developing a reporting guideline for social and psychological intervention trials. Trials, 14: 242.