Heterogeneity and the flaw of averages With interactive confounders explicitly included, the overall treatment effect β0 + β′zt is not a number but a variable that depends on the confounding effects. Absent observation of the interactive confounding effects, what is estimated is some kind of average treatment effect, which Imbens and Angrist (1994) call a “Local Average Treatment Effect,” which is a little like the lawyer who explained that when he...
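A minimal simulation may make the point concrete (the numbers and variable names here are illustrative, not from the post): when each unit's treatment effect is β0 + β1·z but z goes unobserved, regressing the outcome on the randomized treatment recovers only the average effect β0, not any individual unit's effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
z = rng.normal(size=n)            # interactive confounder, unobserved by the analyst
x = rng.binomial(1, 0.5, n)       # randomized treatment
beta0, beta1 = 1.0, 0.5
y = 2.0 + (beta0 + beta1 * z) * x + rng.normal(size=n)

# Difference in means between treated and untreated units: this estimates
# E[beta0 + beta1 * z] = beta0, an average over heterogeneous unit effects.
estimated = y[x == 1].mean() - y[x == 0].mean()
print(round(estimated, 1))        # close to beta0 = 1.0
```

The single number the regression returns is an average over effects that in truth range widely across units, which is exactly the “flaw of averages” the teaser alludes to.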
Read More »What is a statistical model?
What is a statistical model? My critique is that the currently accepted notion of a statistical model is not scientific; rather, it is a guess at what might constitute (scientific) reality without the vital element of feedback, that is, without checking the hypothesized, postulated, wished-for, natural-looking (but in fact only guessed) model against that reality. To be blunt, as far as is known today, there is no such thing as a concrete i.i.d....
Read More »Simpson’s paradox
[embedded content] From a more theoretical perspective, Simpson’s paradox importantly shows that causality can never be reduced to a question of statistics or probabilities, unless you are — miraculously — able to keep constant all other factors that influence the probability of the outcome studied. To understand causality we always have to relate it to a specific causal structure. Statistical correlations are never enough. No structure, no causality. Simpson’s paradox is an...
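The point can be seen in a few lines of Python using the classic kidney-stone figures often quoted to illustrate the paradox (the counts below are those commonly reported, given as successes and totals): treatment A beats B within each severity stratum, yet B beats A once the strata are pooled.

```python
# (successes, total) for each treatment within each stratum
groups = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

# Within each stratum, A has the higher success rate
for name, g in groups.items():
    a, b = g["A"], g["B"]
    print(name, round(a[0] / a[1], 2), ">", round(b[0] / b[1], 2))

# Pooled over strata, the ordering reverses: B looks better
pooled = {t: (sum(g[t][0] for g in groups.values()),
              sum(g[t][1] for g in groups.values()))
          for t in ("A", "B")}
print("pooled", round(pooled["A"][0] / pooled["A"][1], 2),
      "<", round(pooled["B"][0] / pooled["B"][1], 2))
```

The reversal happens because stone severity influences both which treatment is assigned and the chance of success, which is precisely why the correlations alone, without the causal structure, settle nothing.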
Read More »Logistic regression (student stuff)
Logistic regression (student stuff) [embedded content] And in the video below (in Swedish) yours truly shows how to perform a logit regression using Gretl: [embedded content]
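For readers without Gretl, the same kind of logit estimation can be sketched in Python (the data and coefficients below are simulated for illustration; Gretl's own maximum-likelihood routine is Newton-type as well, so the fitted model is the same).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
x = rng.normal(size=n)
p = 1 / (1 + np.exp(-(-0.5 + 1.2 * x)))      # true model: logit(p) = -0.5 + 1.2x
y = rng.binomial(1, p)

X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):                          # Newton-Raphson on the log-likelihood
    mu = 1 / (1 + np.exp(-X @ beta))         # fitted probabilities
    H = X.T @ (X * (mu * (1 - mu))[:, None]) # observed information matrix
    beta += np.linalg.solve(H, X.T @ (y - mu))
print(beta.round(2))                         # near the true (-0.5, 1.2)
```

With a reasonably large sample the estimates land close to the coefficients used to generate the data, which is the sanity check one would also run in Gretl.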
Read More »Causality matters!
[embedded content] Causality in social sciences — and economics — can never be solely a question of statistical inference. Causality entails more than predictability, and really explaining social phenomena in depth requires theory. Analysis of variation — the foundation of all econometrics — can never in itself reveal how these variations are brought about. Only when we are able to tie actions, processes or structures to the statistical relations detected can we say that we...
Read More »Proving gender discrimination using randomization (student stuff)
Proving gender discrimination using randomization (student stuff) [embedded content]
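The randomization logic can be sketched as a permutation test (the promotion counts below are made up for illustration, not taken from the video): if gender were irrelevant, shuffling the labels should routinely produce a gap as large as the observed one; here it almost never does.

```python
import numpy as np

rng = np.random.default_rng(2)
male = np.r_[np.ones(21), np.zeros(3)]       # 21 of 24 'male' files promoted
female = np.r_[np.ones(10), np.zeros(14)]    # 10 of 24 'female' files promoted
observed = male.mean() - female.mean()       # the observed promotion gap

pooled = np.concatenate([male, female])
count = 0
for _ in range(10_000):
    rng.shuffle(pooled)                      # break any link between label and outcome
    if abs(pooled[:24].mean() - pooled[24:].mean()) >= observed:
        count += 1
p_value = count / 10_000
print(p_value)                               # well below 0.05
```

Because the test's only assumption is the randomization itself, a tiny p-value says the gap is very hard to attribute to the luck of the shuffle, which is the whole argument of the exercise.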
Read More »Ed Leamer and the pitfalls of econometrics
Ed Leamer and the pitfalls of econometrics Ed Leamer’s Tantalus on the Road to Asymptopia is one of my favourite critiques of econometrics, and for the benefit of those who are not versed in the econometric jargon, this handy summary gives the gist of it in plain English: Most work in econometrics and regression analysis is — still — done on the assumption that the researcher has a theoretical model that is ‘true.’ Based on this belief of having a...
Read More »The fundamental econometric dilemma
The fundamental econometric dilemma Many thanks for sending me your article. I enjoyed it very much. I am sure these matters need discussing in that sort of way. There is one point, to which in practice I attach a great importance, you do not allude to. In many of these statistical researches, in order to get enough observations they have to be scattered over a lengthy period of time; and for a lengthy period of time it very seldom remains true that the...
Read More »Modern economics — pseudo-science based on FWUTV
Modern economics — pseudo-science based on FWUTV The use of FWUTV — facts with unknown truth values — is, as Paul Romer noted in last year’s perhaps most interesting insider critique of mainstream economics, all too often relied on in macroeconomic modelling. But there are other parts of ‘modern’ economics than New Classical RBC economics that have also succumbed to this questionable practice: Statistical significance is not the same as real-world significance...
Read More »‘Modern’ economics — blah blah blah
‘Modern’ economics — blah blah blah A key part of the solution to the identification problem that Lucas and Sargent (1979) seemed to offer was that mathematical deduction could pin down some parameters in a simultaneous system. But solving the identification problem means feeding facts with truth values that can be assessed, yet math cannot establish the truth value of a fact. Never has. Never will. In practice, what math does is let macro-economists locate...
Read More »