Sometimes, people combine data that really don't belong together - conflict all over the place!
The I² statistic in a meta-analysis tries to pin down how much of that conflict there is. (A meta-analysis pools multiple data sets. Quick intro about meta-analysis here.)
The I² is based on the chi-squared test (χ², or Chi²). You will often see it in a forest plot. It's one way of measuring how much inconsistency there is in the results of different sets of data - that's called heterogeneity. The test gauges whether there is more difference between the results of the studies than you would expect from chance alone.
Here's a (very!) rough guide to interpreting the I² result: 75% or more is "considerable" (an awful lot!).
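If you'd like to see the arithmetic behind it, here's a minimal sketch in Python of how I² falls out of the chi-squared (Cochran's Q) statistic - the trial numbers are made up, purely for illustration:

```python
# A minimal sketch (not from the original post) of how I-squared is derived
# from Cochran's Q, the chi-squared heterogeneity statistic.
import numpy as np

def i_squared(effects, std_errors):
    """Return Cochran's Q and I^2 (%) using fixed-effect (inverse-variance) weights."""
    effects = np.asarray(effects, dtype=float)
    weights = 1.0 / np.asarray(std_errors, dtype=float) ** 2
    pooled = np.sum(weights * effects) / np.sum(weights)   # pooled estimate
    q = np.sum(weights * (effects - pooled) ** 2)          # Cochran's Q (chi-squared, df = k - 1)
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0    # % of variation beyond chance
    return q, i2

# Hypothetical effect estimates (e.g. log odds ratios) from four trials, with standard errors
q, i2 = i_squared([-0.5, -0.1, 0.3, -0.7], [0.2, 0.25, 0.3, 0.22])
print(f"Q = {q:.2f}, I^2 = {i2:.0f}%")
```

The idea: Q measures how far the trial results scatter around the pooled result; I² is the share of that scatter left over after allowing for what chance alone would produce.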
Differences might be responsible for contradictory results - including differences in the people in the trials, the way they were treated, or the way the trials were done. Too much heterogeneity, and the trials really shouldn't be together. But heterogeneity isn't always a deal breaker. Sometimes it can be explained.
Want some in-depth reading about heterogeneity in systematic reviews? Here's an article by Paul Glasziou and Sharon Sanders from Statistics in Medicine [PDF].
Or would you rather see another cartoon about heterogeneity? Then check out the secret life of trials.
(Some of these characters also appear here.)
[Updated 5 November 2016.]