A page to keep interesting reading on topics like evidence-based policy/practice, effective altruism, improving social science etc.

Here’s a table of contents:

  1. Evidence-based practice
  2. Communicating evidence clearly
  3. Using evidence to provide advice
  4. How can we support people to use evidence?
  5. Theories of change
  6. Methods
    1. Overviews
    2. Randomised controlled trials
    3. Evidence synthesis
    4. Effect sizes
    5. Statistical significance, p-values, confidence intervals etc
  7. Research in crisis
    1. Solutions
  8. Criticisms

Evidence-based practice

This classic article by David Sackett is a good introduction to what evidence-based practice is and isn’t.

Communicating evidence clearly

Tom Chivers (@TomChivers) is one of my favourite science journalists. His book with David Chivers, How to Read Numbers, is great for tips on communicating stats. They have this concise statistical style guide too.

Case studies can be an effective way of providing rich examples of effective interventions working in practice. The What Works Centre for Wellbeing has produced some excellent guides to collecting and writing up case studies: https://whatworkswellbeing.org/resources/guide-to-effective-case-studies/ and https://whatworkswellbeing.org/resources/case-study-synthesis-centre-guide/

This looks like a useful tool from the Winton Centre for Evidence Communication on communicating risk: https://realrisk.wintoncentre.uk/.

The FT’s ‘visual vocabulary’ for charts has a good list of potential data visualisations: https://github.com/Financial-Times/chart-doctor/blob/main/visual-vocabulary/poster.png

Using evidence to provide advice

This article proposes ‘evidence readiness levels’ that can be used to communicate the quality of evidence behind advice.

This is a great checklist for gauging the ‘policy readiness’ of an idea.

How can we support people to use evidence?

No one will read your journal article… How can you get them to pay attention?

Stephanie Waddell at the EIF has this sound overview of the state of the art, based on research, reflections from What Works colleagues, and EIF’s own experience. I made some notes on it here.

Susan Michie’s behaviour change wheel is a useful starting point for thinking of approaches to implementing evidence-based change.

Langer, L., Tripney, J. S., & Gough, D. (2016). The science of using science: Researching the use of research evidence in decision-making. A systematic review of research on using research. Here’s a more recent review, which highlights the lack of research on this question: only 6% of the research dissemination initiatives they found had been evaluated.

https://transforming-evidence.org/ – a cool website collecting work of Kathryn Oliver and Annette Boaz. Lots of useful blogs and papers.

If you need some inspiration, this Twitter thread has examples of RCTs leading to change.

Theories of change

I use theories of change all the time, whether to plan a study or a dissemination effort. A theory of change is a useful tool for thinking more clearly about how an intervention works.

NPC have a detailed guide to developing TOCs.

Also, Innovations for Poverty Action have a similar guide.

Many researchers will be used to developing TOCs for other people’s interventions. This EA forum post makes the case for using them to plan your own attempts to get your research used.

Methods

Overviews

A curated list of methodological blog posts from the World Bank.

Randomised controlled trials

This Cabinet Office paper by Ben Goldacre and others is still a great, easy-to-understand introduction.

There are plenty of myths and misconceptions about RCTs in social science. This article does a good job of debunking them.

Evidence synthesis

I don’t have great online recommendations here. My best steer is to invest in two high-quality textbooks. They are worth the money if you want to understand synthesis.

An Introduction to Systematic Reviews from the experts at the EPPI-Centre has everything you’d need on systematic reviews.

Michael Borenstein et al’s Introduction to Meta-analysis does a great job of breaking down meta-analysis into bitesize and accessible chapters.

This is a very useful critique of common meta-analysis practice.

Effect sizes

This is a cool tool for representing Cohen’s D as two overlapping distributions. Great for getting a more intuitive sense of whether an effect really matters.
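As a rough sketch of the quantities such a tool visualises (function names are my own, and this assumes two equal-variance normal distributions), you can convert Cohen’s d into the overlap between the two distributions and the ‘common language’ probability that a randomly chosen treated unit outscores a randomly chosen control unit:

```python
import math

def normal_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def overlap_coefficient(d):
    # Shared area of the two distributions, assuming equal
    # variances: OVL = 2 * Phi(-|d| / 2)
    return 2 * normal_cdf(-abs(d) / 2)

def prob_superiority(d):
    # Common language effect size: Phi(d / sqrt(2))
    return normal_cdf(d / math.sqrt(2))

# A 'medium' effect of d = 0.5 still means heavily overlapping groups
print(round(overlap_coefficient(0.5), 2))  # about 0.80
print(round(prob_superiority(0.5), 2))     # about 0.64
```

Seeing that a d of 0.5 leaves roughly 80% of the distributions overlapping is exactly the kind of intuition the tool conveys graphically.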

Matthew Kraft has this great framework for thinking about effect sizes across the social sciences. Kraft notes that Cohen’s original benchmarks for small, medium and large effects don’t really apply to real-world effectiveness studies, and provides some alternative benchmarks for education studies. See also this review by the Centre for Global Development.

This chapter of the Cochrane handbook has a detailed discussion of different effect sizes.

I have my own piece on odds ratios.
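One reason odds ratios deserve their own piece is that they get misread as risk ratios. A quick illustration with made-up outcome rates shows how far the two can diverge when an outcome is common:

```python
def odds(p):
    # Convert a probability into odds
    return p / (1 - p)

# Hypothetical rates: the outcome occurs in 40% of the control
# group and 60% of the treatment group
p_control, p_treatment = 0.40, 0.60

risk_ratio = p_treatment / p_control               # 1.5
odds_ratio = odds(p_treatment) / odds(p_control)   # 2.25

print(risk_ratio, odds_ratio)
```

Reading that odds ratio of 2.25 as ‘more than twice the risk’ would substantially overstate an effect that is really a 1.5-fold increase. For rare outcomes the two measures converge.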

Statistical significance, p-values, confidence intervals etc

This is an excellent and sound overview from my former colleague Guillermo.

This is a nice guide to myths and misconceptions. P-values aren’t the probability that your effect is true! Confidence intervals don’t have a 95% chance of containing the true effect!
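The correct frequentist reading is about long-run coverage: across many repeated samples, roughly 95% of the intervals constructed this way contain the true value. A quick simulation (with made-up parameters) makes the point:

```python
import random
import statistics

random.seed(0)
TRUE_MEAN, SD, N, TRIALS = 100, 15, 50, 2000

covered = 0
for _ in range(TRIALS):
    # Draw a fresh sample and build a 95% interval for the mean
    sample = [random.gauss(TRUE_MEAN, SD) for _ in range(N)]
    mean = statistics.mean(sample)
    se = statistics.stdev(sample) / N ** 0.5
    lo, hi = mean - 1.96 * se, mean + 1.96 * se
    covered += lo <= TRUE_MEAN <= hi

print(round(covered / TRIALS, 3))  # close to 0.95
```

The 95% is a property of the procedure over repeated sampling, not of any single interval it produces.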

Stephen Gorard’s book has a clear but very sceptical view on this stuff. I’m not sure I agree with everything in it but this book definitely improved my understanding.

This American Statistical Association statement and the associated papers provide very useful detail.

I’ve started my own modest attempt at explaining this stuff.

Research in crisis

At times it can seem that social science is in a death spiral. There’s the replication crisis, hyped-up reporting, outright fraud…

Stuart Ritchie’s book Science Fictions has the best overview of these issues.

Solutions

This optimistic study suggests that the replication rate improves dramatically when you follow good scientific practices.

Criticisms

Not everyone gets as excited about this stuff as you and me. Here are some critics.

A lot of social science relies on measuring outcomes using surveys. Here’s a critical take on surveys from a literal banana. If you dig around that site you’ll find plenty more weird-but-interesting thoughts on social science from the banana.

Seeing Like a State is a classic critique of planned interventions. But also read this review.
