

Even the Best Scientific Studies Can Lie: The Case of Craniosacral Therapy

Meta-analyses promise to give us the best answer science has, but as a recent case involving craniosacral therapy demonstrates, they can fall victim to "garbage in, garbage out."

There are days when I wish I could say, "Look for a meta-analysis that answers your question and trust it blindly." Do vitamin supplements work? Look for a meta-analysis! Are sausages bad for your health? Meta-analysis! Will an aspirin a day keep the doctor away? Meta-analysis!

Meta-analyses are often held up as the best form of scientific evidence. Forget relying on doctors reporting on a single case or a string of cases; forget depending on scientists looking backwards in time at a group of people who had an intervention versus another group that did not; forget counting on prospective studies; forget even clinging onto a single randomized clinical trial. A meta-analysis has the power to put all of the studies together, synthesize their results, and produce a single magical number. This number can tell you if the intervention works or doesn't work according to the best evidence we have. Why couldn't we trust this magical number?

Although meta-analyses are wonderful, they can suffer from one problem: garbage in, garbage out. As an example, if a student takes four tests meant to assess the same aptitude but the tests are badly designed, the fact that the student's mean score ends up being 92% is irrelevant. If the tests that go into the averaging machine are bad, the final result is just as bad.
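To make that single pooled number concrete, here is a minimal sketch, in Python, of the inverse-variance weighting a fixed-effect meta-analysis typically uses; the effect sizes and variances below are invented for illustration, and the only point is that the arithmetic will dutifully average whatever studies it is fed, good or bad.

```python
# A minimal sketch (not from the article) of fixed-effect meta-analytic pooling:
# each study's effect estimate is weighted by the inverse of its variance,
# so the pooled number is only as trustworthy as the estimates fed into it.

def pooled_effect(effects, variances):
    """Inverse-variance weighted mean and its variance (fixed-effect model)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_variance = 1.0 / sum(weights)
    return pooled, pooled_variance

# Hypothetical effect sizes (e.g., standardized pain reductions) from three
# small, poorly designed trials: the math happily averages them anyway.
effects = [0.8, 0.6, 0.9]
variances = [0.10, 0.15, 0.12]
print(pooled_effect(effects, variances))
```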

We were recently sent a meta-analysis on the use of craniosacral therapy for pain. The conclusion of this meta-analysis is that craniosacral therapy works. But does it?

What if medical doctors couldn't even agree on a heartbeat?

Craniosacral therapy is a complementary or alternative intervention that focuses on rhythm. Imagine an egg with cracks in its shell. Now imagine the inside of this egg is pulsating, like there's a living heart underneath the shell. The cracks would expand and contract in time with the pulse. Likewise, people who believe in craniosacral therapy think that the fluid around our brain and spine (called cerebrospinal fluid) similarly pulses, and the bones making up our skull expand and contract to the beat of this pulse. According to these believers, this pulse is different from the breath and from blood pressure. By touching and cupping certain parts of your body, craniosacral therapists claim they can feel this pulse, diagnose you based on its qualities, and treat you.

Whenever craniosacral therapists have been asked in the context of a study to feel the craniosacral pulse of a dozen patients, they can't agree on how fast the pulse is. (To be fair, those were small studies, but it is impressive how divergent these qualified and experienced therapists were in their assessments, with one study showing not a single occasion during which all three examiners agreed on the pulse rate in the 12 patients they examined.) There is, in fact, no good evidence that this craniosacral rhythm even exists, so it's probable they are feeling their patient's heart rate, their own heart rate, or simply imagining things. And when it comes to displacing the plates of your skull to treat you, there is no evidence that these tiny movements do anything to the cranium. One study showed that a 20-kilogram force was needed to cause a movement of one millimetre; the 5 to 10 grams applied by a therapist's fingers are nowhere near strong enough. If a light touch were this powerful, imagine the impact of jogging on your skull!

Despite all this damning evidence, here we are with a meta-analysis concluding that craniosacral therapy works for pain. How do we explain this conclusion given that meta-analyses are the best form of scientific evidence?

Comparing anything to nothing will make anything look good

The authors of the meta-analysis found 10 published trials on the use of craniosacral therapy to treat pain, but if you don't look up these individual trials, you won't know just how bad they are.

One of these trials was actually a feasibility study ("can we even do this?"), the preliminary results of which were presented via poster at a conference. No peer-reviewed paper, no actual trial to follow up. The results of two other trials actually argue against craniosacral therapy being particularly terrific: one showed it was just as good as massaging muscle knots and the other had a negative result. But not content to let that negative result make a big splash, the authors of that second study dug deep and started comparing all sorts of things between their participants who got craniosacral therapy and those who got a classic massage to finally reveal that, drum roll please, there were differences in the levels of potassium (what?) and magnesium (double what?) in their blood (but not of sodium, chloride, phosphate, calcium, and lactic acid, which they also tested for). The more tests you run, the more likely you are to get a "ding ding ding" from the slot machine by pure chance.
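To see how easily that slot machine pays out, here is a quick simulation (my own illustration, not taken from the study): two groups drawn from exactly the same distribution are compared on 20 unrelated blood markers, and chance alone tends to produce at least one "significant" p-value.

```python
# Simulation of the multiple-comparisons "slot machine": the two groups are
# identical by construction, yet testing many markers at the 0.05 threshold
# will still produce occasional spurious hits.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_markers = 20        # e.g., potassium, magnesium, sodium, chloride, ...
n_per_group = 30      # participants per arm (arbitrary choice)

false_positives = 0
for marker in range(n_markers):
    group_a = rng.normal(loc=0.0, scale=1.0, size=n_per_group)  # no real difference
    group_b = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    _, p = stats.ttest_ind(group_a, group_b)
    if p < 0.05:
        false_positives += 1

# With 20 comparisons at the 0.05 threshold, roughly one spurious "hit" is
# expected even though nothing truly differs between the groups.
print(f"Spurious 'significant' markers: {false_positives} out of {n_markers}")
```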

We then come face to face with five trials that did not have the right control group. If you're in pain and want to know if someone correcting your supposed brain fluid rhythm is better, the question is, better than what? Better than not doing anything? Any intervention that doesn't make the pain worse is bound to make you feel better. These five trials either compared against nothing, against resting on your back, against deactivated magnets, or against deactivated ultrasound. You need to compare this type of hands-on manual therapy to a different type of hands-on manual therapy to know if this particular intervention, with all of its pseudoscientific baggage, is actually better. Which leaves us with two studies which, while better, are far from perfect (see the expandable box for more detail).

For those who want to delve deeper: the last two studies of craniosacral therapy for pain

There are two studies on craniosacral therapy for pain that were part of the meta-analysis and that can't be dismissed as easily as the other eight. The first, by Nourbakhsh, focuses on tennis elbow, while the second, by the same team that authored the meta-analysis itself, looks at chronic neck pain.

The sham group used by Nourbakhsh is quite interesting: the therapists performed a similar intervention with their hands but they directed the oscillating energy they generated away from the affected area. In terms of experimental design, this is much better than the rest. However, only 11 participants received the treatment and 12 got the sham intervention. These numbers are tiny and will invite a lot of noise in the data. The graphs showing the results for both groups have error bars so large (indicating the span of all individual results) you could drive an entire cohort of participants through them. This is not informative.

The second study, this one focusing on chronic neck pain, had 54 participants randomized to either craniosacral therapy or a light-touch sham treatment. Every participant was told that two different craniosacral therapies would be tested, so that no one would expect to be in a placebo arm. The authors reported that on their primary outcome measure, pain intensity, there was a significant and clinically relevant effect of the genuine treatment over the sham. I reached out to Dr. Christopher Labos, an associate in our office with a background in biostatistics, to take a closer look at the study. Apart from a few questionable decisions (the exclusion of patients with a distinct neck pathology, the absence of confidence intervals, and the lack of adjustment for multiple testing on the grounds that the secondary analyses were only exploratory), what truly jumped out at him was that people in the sham group took more medications and could have been sicker. Indeed, looking at Table 2, we can see noticeable differences between the group that received craniosacral therapy and the group that got the sham: 0% of the former regularly took pain medication at the start of the trial vs. 3.9% of the latter; 25.9% took it when needed vs. 53.9%; and 48.2% of the treatment group had a history of taking pain medication vs. 70.4% of the sham group. None of the p-values calculated when comparing the two groups were significant, which Dr. Labos attributes to the small samples. But clearly, participants in the sham group were relying a lot more on medication than those in the treatment group. The two groups were thus not equivalent, and it's no wonder that the treatment was seen to be superior to the sham in people who had to use less pain medication in the first place.
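To illustrate Dr. Labos's point about small samples, here is a rough simulation (my own, using an assumed split of about 27 participants per arm and the approximate reported percentages): even a genuine imbalance of roughly 26% versus 54% in medication use frequently fails to reach statistical significance at that size, so non-significant baseline p-values are weak reassurance that the groups were comparable.

```python
# Simulated trials with an assumed 27 participants per arm and a true
# imbalance in "takes pain medication when needed" (approx. 26% vs. 54%).
# How often does Fisher's exact test flag the difference as significant?

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_per_group = 27                    # assumed arm size (54 randomized in total)
p_treatment, p_sham = 0.26, 0.54    # approximate reported proportions
n_simulations = 5000

significant = 0
for _ in range(n_simulations):
    treated = rng.binomial(n_per_group, p_treatment)
    sham = rng.binomial(n_per_group, p_sham)
    table = [[treated, n_per_group - treated],
             [sham, n_per_group - sham]]
    _, p = stats.fisher_exact(table)
    if p < 0.05:
        significant += 1

# In roughly half of the simulated trials this real difference goes
# statistically undetected, which is why a non-significant baseline test
# is weak evidence that randomization balanced the groups.
print(f"Detected in {significant / n_simulations:.0%} of simulated trials")
```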

Meta-analyses can be flawed

What we are left with is a celebratory meta-analysis (impressive!) of ten studies, eight of which can be dismissed as being either negative or poorly conceived (not so impressive), one of which had too few participants, and the last one of which compared two groups that were actually different in their pain medication use. Can craniosacral therapy be relaxing? Can it help people suffering from chronic pain feel a little bit better? Yes. But its underlying belief system about the craniosacral rhythm is simply made up. Whenever seasoned professionals have been tested, they have been found to profoundly disagree on what it is that they are feeling through their fingers. As for meta-analyses, they can be incredibly informative provided you feed them the right building blocks. You can lay down a house foundation made of brittle plastic and paint it the colour of concrete, but I wouldn't use it to support much of anything. Garbage in, garbage out.

Take-home message:
-A meta-analysis summarizes all of the results scientists have for a particular question to give the best possible answer.
-If the studies that go into a meta-analysis are garbage, the meta-analysis itself will be garbage.
-Craniosacral therapy claims to diagnose and treat people based on the pulse of the fluid around the brain and the movement of skull bones, but despite a recent meta-analysis reporting it works for pain, there is no evidence this pulse exists and there is data to show that therapists can't even agree on the pulse.


