On the non-applicability of statistical models

Eminent statistician David Salsburg is rightly very critical of the way social scientists — including economists and econometricians — have, uncritically and without argument, come to simply assume that they can apply probability distributions from statistical theory to their own area of research:

We assume there is an abstract space of elementary things called ‘events’ … If a measure on the abstract space of...
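For reference, a minimal LaTeX sketch of the Kolmogorov setup the quote alludes to (the triple (Ω, F, P) is the standard textbook formulation, not Salsburg's own notation):

```latex
% A probability space: sample space, sigma-algebra of events, measure.
\[
(\Omega, \mathcal{F}, P), \qquad P(\Omega) = 1, \qquad
P(A) \geq 0 \quad \text{for all } A \in \mathcal{F},
\]
\[
P\Bigl(\bigcup_{i=1}^{\infty} A_i\Bigr) = \sum_{i=1}^{\infty} P(A_i)
\quad \text{for pairwise disjoint } A_i \in \mathcal{F}.
\]
```

The critique is precisely that nothing guarantees the 'events' social scientists study actually constitute such a measurable space.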
Econometrics — still lacking valid ontological foundations

Important and far-reaching problems still beset regression analysis and econometrics — many of which are basically a result of unsustainable ontological views. Most econometricians have a nominalist-positivist view of science and models, according to which science can only deal with observable regularity patterns of a more or less lawlike kind. Only data matters, and trying to (ontologically) go...
Linearity — a dangerous assumption

[embedded video]
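The video itself is not recoverable here, but a minimal, hypothetical Python sketch of the titular danger: a straight-line fit to plainly nonlinear data can report 'no effect' even though the dependence is total (all data below are simulated purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data with a clearly nonlinear (quadratic) relationship.
x = rng.uniform(-3, 3, 200)
y = x**2 + rng.normal(0, 1, 200)

# Fit a straight line anyway.
slope, intercept = np.polyfit(x, y, 1)
y_hat = slope * x + intercept

# R^2 of the linear fit: close to zero despite perfect dependence.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print(f"linear-fit R^2:  {1 - ss_res / ss_tot:.3f}")
print(f"estimated slope: {slope:.3f}")  # ~0: reads as 'no effect', wrongly
```

With x symmetric around zero, the best-fitting line is flat, so a linearity-assuming analysis concludes x is irrelevant to y.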
P-hacking and data dredging

P-hacking refers to massaging your data and analysis methods until your result reaches a statistically significant p-value. I will put it to you that in practice most p-hacking is not necessarily about hacking p-values but about dredging your data until your results fit a particular pattern. That may be something you predicted but didn’t find, or it could even just be some chance finding that looked interesting and is amplified...
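A minimal, hypothetical simulation of the mechanism described: keep testing noise-only subgroups until one crosses p < 0.05 (everything below is invented for illustration, not taken from the post):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Pure noise: no real group difference exists anywhere in these data.
n_subgroups = 20
found = []
for g in range(n_subgroups):
    treatment = rng.normal(0, 1, 30)
    control = rng.normal(0, 1, 30)
    _, p = stats.ttest_ind(treatment, control)
    if p < 0.05:
        found.append((g, round(p, 3)))

# With 20 independent looks at noise, the chance of at least one
# 'significant' result is 1 - 0.95**20, i.e. roughly 64%.
print(f"'significant' subgroups found in noise: {found}")
```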
Time to abandon statistical significance

We recommend dropping the NHST [null hypothesis significance testing] paradigm — and the p-value thresholds associated with it — as the default statistical paradigm for research, publication, and discovery in the biomedical and social sciences. Specifically, rather than allowing statistical significance as determined by p < 0.05 (or some other statistical threshold) to serve as a lexicographic decision rule in...
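A minimal, hypothetical illustration of why a lexicographic p < 0.05 rule is fragile: exact replications of the same modest true effect scatter their p-values across both sides of the threshold (simulated numbers only):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# One fixed, real effect; many exact replications of the same study.
true_effect, n, reps = 0.3, 50, 1000
p_values = []
for _ in range(reps):
    treatment = rng.normal(true_effect, 1, n)
    control = rng.normal(0, 1, n)
    p_values.append(stats.ttest_ind(treatment, control).pvalue)

p_values = np.array(p_values)
# The same underlying reality lands on both sides of the threshold:
print(f"share with p < 0.05:  {np.mean(p_values < 0.05):.2f}")
print(f"share with p >= 0.05: {np.mean(p_values >= 0.05):.2f}")
```

A rule that declares one replication a 'discovery' and another a 'null result' is reacting to sampling noise, not to the (identical) underlying effect.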
Big Data — Poor Science

Almost everything we do these days leaves some kind of data trace in some computer system somewhere. When such data is aggregated into huge databases it is called “Big Data”. It is claimed social science will be transformed by the application of computer processing and Big Data. The argument is that social science has, historically, been “theory rich” and “data poor” and now we will be able to apply the methods of “real science” to...
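A minimal, hypothetical sketch of what theory-free mining at scale produces: screen enough unrelated series and something will always correlate impressively (all series below are pure simulated noise):

```python
import numpy as np

rng = np.random.default_rng(7)

# One outcome series and 10,000 candidate 'predictors': all pure noise.
n_obs, n_vars = 100, 10_000
y = rng.normal(size=n_obs)
X = rng.normal(size=(n_vars, n_obs))

# Pearson correlation of every predictor with the outcome.
Xc = X - X.mean(axis=1, keepdims=True)
yc = y - y.mean()
r = (Xc @ yc) / (np.linalg.norm(Xc, axis=1) * np.linalg.norm(yc))

# Mining 10,000 noise series reliably turns up an 'impressive' pattern.
print(f"strongest |correlation| found: {np.abs(r).max():.2f}")  # ~0.4
```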
‘Autonomy’ in econometrics
The point of the discussion, of course, has to do with where Koopmans thinks we should look for “autonomous behaviour relations”. He appeals to experience but in a somewhat oblique manner. He refers to the Harvard barometer “to show that relationships between economic variables … not traced to underlying behaviour equations are unreliable as instruments for prediction” … His argument would have been more effectively put had he been able to give instances of relationships that...
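A minimal, hypothetical simulation of what 'autonomy' is about: a reduced-form correlation that forecasts well under one regime collapses when the underlying behaviour changes, while the structural (autonomous) relation survives. The equations and numbers are illustrative inventions, not Koopmans's:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(policy_coef, n=500):
    # Structural (autonomous) relations: x is set partly by policy z;
    # y responds to x with a fixed behavioural coefficient of 2.0.
    z = rng.normal(size=n)
    x = policy_coef * z + rng.normal(size=n)
    y = 2.0 * x + rng.normal(size=n)
    return z, x, y

# Regime 1: the reduced-form y-on-z slope looks like a stable 'law'.
z, x, y = simulate(policy_coef=1.0)
print(f"reduced-form slope, regime 1: {np.polyfit(z, y, 1)[0]:.2f}")  # ~2.0

# Regime change: policy behaves differently and the reduced form breaks...
z, x, y = simulate(policy_coef=0.2)
print(f"reduced-form slope, regime 2: {np.polyfit(z, y, 1)[0]:.2f}")  # ~0.4

# ...but the autonomous y-on-x relation is invariant across regimes.
print(f"structural slope, regime 2:   {np.polyfit(x, y, 1)[0]:.2f}")  # ~2.0
```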
James Heckman — ‘Nobel prize’ winner gone wrong

Here’s James Heckman in 2013: “Also holding back progress are those who claim that Perry and ABC are experiments with samples too small to accurately predict widespread impact and return on investment. This is a nonsensical argument. Their relatively small sample sizes actually speak for — not against — the strength of their findings. Dramatic differences between treatment and control-group outcomes are...
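A minimal, hypothetical simulation of why the quoted reasoning fails: when only small samples that cross a significance threshold get attention, the surviving estimates systematically exaggerate the true effect (numbers are simulated, not from Perry or ABC):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

true_effect, n, reps = 0.2, 25, 5000  # small true effect, tiny samples
significant_estimates = []
for _ in range(reps):
    treatment = rng.normal(true_effect, 1, n)
    control = rng.normal(0, 1, n)
    t, p = stats.ttest_ind(treatment, control)
    if p < 0.05 and t > 0:
        significant_estimates.append(treatment.mean() - control.mean())

# Small samples only reach significance when noise inflates the effect,
# so 'dramatic differences' are exactly what the filter manufactures:
print(f"true effect:                 {true_effect}")
print(f"mean 'significant' estimate: {np.mean(significant_estimates):.2f}")
```

Dramatic treatment-control gaps in small significant samples are thus evidence of the significance filter at work, not of unusually strong findings.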
Understanding the limits of statistical inference

[embedded video]

This is indeed an instructive video on what statistical inference is all about. But we have to remember that economics and statistics are two quite different things, and as long as economists cannot identify their statistical theories with real-world phenomena, there is no real warrant for taking their statistical inferences seriously. Just as there is no such thing as a ‘free lunch,’...
Econometric alchemy
Thus we have “econometric modelling”, that activity of matching an incorrect version of [the parameter matrix] to an inadequate representation of [the data generating process], using insufficient and inaccurate data. The resulting compromise can be awkward, or it can be a useful approximation which encompasses previous results, throws light on economic theory and is sufficiently constant for prediction, forecasting and perhaps even policy. Simply writing down an “economic...
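A minimal, hypothetical illustration of the 'alchemy': a misspecified model can fit the available sample acceptably yet fail as soon as it is used for the prediction the quote demands (the data generating process and numbers are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(11)

# True data generating process: exponential growth plus noise.
t = np.arange(40, dtype=float)
y = np.exp(0.05 * t) + rng.normal(0, 0.1, t.size)

# 'Econometric modelling': fit a linear trend to the first 30 periods.
train = slice(0, 30)
b, a = np.polyfit(t[train], y[train], 1)

in_sample_err = np.abs(a + b * t[train] - y[train]).mean()
out_sample_err = np.abs(a + b * t[30:] - y[30:]).mean()

# The approximation looks fine in sample but is not 'sufficiently
# constant for prediction': errors grow once the sample ends.
print(f"mean abs error, in sample: {in_sample_err:.3f}")
print(f"mean abs error, forecast:  {out_sample_err:.3f}")
```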