With all that I have been promoting the Kickstarter, you would think I wouldn’t highlight the limits of meta-analysis.
But progress depends on being honest about the strengths and weaknesses of any method. So let me tell you a story about a time when a meta-analysis didn't really help much. Fittingly, this story is essentially the opposite of a meta-analysis: it's a single anecdote.
Young as I am (late 20s), I hurt my back. Badly. I hurt it 4 times in the past year. The most recent time, I was simply trying to walk down the street when it struck, bad enough for me to almost pass out on the street (losing most of your senses but retaining consciousness is a strange feeling).
Things got a lot better. But later on, about to go to Kenya for an impact evaluation planning workshop, I was a little nervous at the prospect of having to sit for a prolonged period of time, since that’s what caused my back to go out in the first place. So I returned to my physical therapist and she suggested I try “dry needling”.
I hadn't heard the term before, and to be honest I didn't look into it too much. This particular physical therapist's office had already proven to be worth its weight in gold, and I had great insurance, so why not give it a shot.
It wasn’t done to treat pain but to stretch out my hamstrings a bit, because your hamstrings can pull your back out of place. Before having the dry needling done, my right hamstrings were always tighter than my left. Every single time I stretched, for months, there was a visible difference — I had over 100 observations.
The physical therapist dry-needled only my right hamstrings, not my left. And afterwards, my right hamstrings, for the first time, stretched farther than my left. I wasn't anticipating that; I was genuinely surprised and had to take a moment to work out the cause, so I don't think it was a placebo effect.
It seems pretty darn causal to me!
Yet I later heard that no firm conclusions have been drawn about dry needling from meta-analysis.*
Without looking at the paper, I could think of two main hypotheses to explain this:
1) Perhaps it has mixed effects and I just got lucky;
2) Perhaps the studies that went into the meta-analysis were of poor quality.
It turns out the reason for the lack of firm findings is much more mundane: the authors simply could not find many good-quality studies to include (see, for instance, the number of studies cited in Figure 3 on page 25). Honestly, when they get down to using one paper at a time, it stretches the definition of "meta-analysis".
Still, the episode highlights a major problem meta-analyses face — they are only as good as the studies that go into them.
They also tend to put undue emphasis on the mean, when we may be just as interested in other moments of the distribution. Impact evaluations are often guilty of this as well, reminding me of this comic.
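To make the point about moments concrete, here is a minimal sketch (with made-up numbers, not real dry-needling data) of two hypothetical treatment-effect distributions that share the same mean but tell very different stories once you look at the spread:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical treatment-effect distributions with the same mean.
# Purely illustrative numbers, not real clinical data.
uniform_benefit = rng.normal(loc=2.0, scale=0.5, size=10_000)  # nearly everyone gains ~2
mixed_effects = rng.normal(loc=2.0, scale=6.0, size=10_000)    # some gain a lot, some are harmed

for name, effects in [("uniform", uniform_benefit), ("mixed", mixed_effects)]:
    print(f"{name:8s} mean={effects.mean():+.2f}  "
          f"sd={effects.std():.2f}  "
          f"share harmed={np.mean(effects < 0):.0%}")
```

A meta-analysis that reports only the pooled mean would rate both treatments identically, even though in the second case a large minority of people are actively worse off.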
Despite this, you can be sure that if we ever do get a better sense of the general effectiveness of dry needling and the conditions under which it works, it will be because a meta-analysis has synthesized the data from more good studies.
* That paper looks at low back pain rather than hamstring flexibility, but there is even less evidence for hamstring flexibility.