Goldilocks is right: that review is FAR too complicated. The methods section alone is 652 pages long! Which wouldn't be so bad if it weren't also a few years out of date. The review took so long to do, and to pass through a rigorous enough quality check, that it was already out of date the day it was released. Something that happens often enough to be rather disheartening.
When the methodology of systematic reviewing gets overly rococo, the point of diminishing returns gets passed. That's a worry for a few reasons. First, it's inefficient: more reviews could be done with the same resources. Second, more complex methodology can be daunting, and hard for researchers to carry out consistently. Third, when a review gets very elaborate, reproducing or updating it isn't going to be easy either.
It's unavoidable for some reviews to be massive and complex undertakings, though, if they're going to get to the bottom of massive and complex questions. Goldilocks is right about review number 2, as well: that one is WAY too simple. And that's a serious problem, too.
Reviewing evidence needs to be a well-conducted research exercise. A great way to find out more about what goes wrong when it's not is to read Testing Treatments. And see more on this here at Statistically Funny, too.
You need to check the methods section of every review before you take its conclusions seriously - even when it claims to be "evidence-based" or systematic. People can take far too many shortcuts. Fortunately, it's not often that a review gets as bad as the second one Goldilocks encountered here. The authors of that review decided to include only one trial for each drug "in order to keep the tables and figures to a manageable size." Gulp!
Getting to a good answer also quite simply takes some time and thought. Making real sense of evidence and the complexities of health, illness and disability is often just not suited to a "fast food" approach. As the scientists behind the Slow Science Manifesto point out, science needs time for thinking and digesting.
To cover more ground, though, people are looking for reasonable ways to cut corners. There are many kinds of rapid review, including relying on previous systematic reviews as the basis for new ones. These can be, but aren't always, rigorous enough for us to be confident about their conclusions.
You can see this process at work in the set of reviews discussed at Statistically Funny a few cartoons ago. Review number 3 there is in part based on review number 2 - without re-analysis. And then review number 4 is based on review number 3.
So if one review gets it wrong, other work may be built on weak foundations. Li and Dickersin suggest this might be a clue to the perpetuation of incorrect techniques in meta-analyses: reviewers who got it wrong in their review were citing other reviews that had gotten it wrong, too. (That statistical technique, by the way, has its own cartoon.)
Luckily for Goldilocks, the bears had found a third review. It had sound methodology you could trust. It had been totally transparent from the start - included in PROSPERO, the international prospective register of systematic reviews. Goldilocks can get at the fully open review quickly via PubMed Health, and its data are in the Systematic Review Data Repository, open to others to check and re-use. Ahhh - just right!
I'm grateful to the Wikipedians who put together the article on Goldilocks and the three bears. That article pointed me to the fascinating discussion of "the rule of three" and the hold this number has on our imaginations.