
Meta-analysis — nothing but an exercise in mega-silliness





Including all relevant material – good, bad, and indifferent – in meta-analysis admits the subjective judgments that meta-analysis was designed to avoid. Several problems arise in meta-analysis: regressions are often non-linear; effects are often multivariate rather than univariate; coverage can be restricted; bad studies may be included; the data summarised may not be homogeneous; grouping different causal factors may lead to meaningless estimates of effects; and the theory-directed approach may obscure discrepancies. Meta-analysis may not be the one best method for studying the diversity of fields for which it has been used …


Glass and Smith carried out a meta-analysis of research on class size and achievement and concluded that “a clear and strong relationship between class size and achievement has emerged.” The study was done and analysed well; it might almost be cited as an example of what meta-analysis can do. Yet the conclusion is very misleading, as is the estimate of effect size it presents: “between class-size of 40 pupils and one pupil lie more than 30 percentile ranks of achievement.” Such estimates imply a linear regression, yet the regression is extremely curvilinear, as one of the authors’ figures shows: between class sizes of 20 and 40 there is absolutely no difference in achievement; it is only with unusually small classes that there seems to be an effect. For a teacher the major result is that for 90% of all classes the number of pupils makes no difference at all to their achievement. The conclusions drawn by the authors from their meta-analysis are nominally correct, but they are statistically meaningless and particularly misleading. No estimate of effect size is meaningful unless regressions are linear, yet such linearity is seldom investigated, or, if not present, taken seriously.

H J Eysenck
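The curvilinearity point can be made concrete with a small simulation. The sketch below is purely illustrative (the breakpoint at roughly 20 pupils, the noise level, and the sample size are assumptions, not Glass and Smith’s data): achievement is flat for classes of 20–40 pupils and rises only for very small classes, yet a straight-line summary still reports a sizeable gain per pupil removed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data-generating process (an assumption for illustration,
# not Glass and Smith's data): achievement is flat for class sizes of
# 20-40 pupils and improves only below about 20 pupils.
def true_achievement(class_size):
    return np.where(class_size < 20, 60 + 1.5 * (20 - class_size), 60.0)

class_size = rng.integers(1, 41, size=500)            # classes of 1 to 40 pupils
achievement = true_achievement(class_size) + rng.normal(0, 5, size=500)

# The kind of linear "effect size" summary the meta-analytic estimate implies.
slope, intercept = np.polyfit(class_size, achievement, 1)
print(f"linear fit: {slope:.2f} achievement points per extra pupil")
print(f"implied gap between 1 and 40 pupils: {slope * (1 - 40):.1f} points")

# What the curvilinear relationship actually says for the typical classroom:
gap_20_vs_40 = float(true_achievement(np.array([20.0]))[0]
                     - true_achievement(np.array([40.0]))[0])
print(f"true gap between 20 and 40 pupils: {gap_20_vs_40:.1f} points")
```

The linear summary attributes a gain to every pupil removed, while in the simulated data the whole effect comes from the handful of very small classes. That is exactly the discrepancy Eysenck describes: the effect-size estimate is only meaningful if the regression is linear, and here it is not.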

Systematic reviews in science are extremely important to undertake in our search for robust evidence and explanations; simply averaging data from different populations, places, and contexts is not. Meta-analysis would be a wonderful method if its assumptions held. However, they don’t!
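In the same spirit, here is a minimal sketch of what “simply averaging” across heterogeneous studies does. The effect estimates, standard errors, and grouping below are invented for illustration: a classical fixed-effect pooled estimate lands between two clearly different groups of studies and describes neither, while a standard heterogeneity statistic flags the problem the average hides.

```python
import numpy as np

# Invented effect estimates (standardised mean differences) and standard
# errors from two heterogeneous groups of studies -- not real data.
effects = np.array([0.05, 0.02, 0.08,   # one population: essentially no effect
                    0.60, 0.55, 0.65])  # another population: a clear effect
se      = np.array([0.05, 0.06, 0.05,
                    0.06, 0.05, 0.06])

# Classical fixed-effect (inverse-variance) pooling.
weights = 1 / se**2
pooled = np.sum(weights * effects) / np.sum(weights)
print(f"pooled effect: {pooled:.2f}")   # about 0.3 -- a value no study shows

# Cochran's Q heterogeneity statistic: large relative to its degrees of
# freedom, signalling that these studies should not be averaged at all.
Q = np.sum(weights * (effects - pooled)**2)
print(f"Cochran's Q: {Q:.1f} on {len(effects) - 1} degrees of freedom")
```

A random-effects model would widen the confidence interval, but no amount of pooling turns the 0.3 into a meaningful description of either group of studies.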

Lars Pålsson Syll
Professor at Malmö University. Primary research interests: the philosophy, history, and methodology of economics.
