By Aaron S. Edlin and Michael Love:
Knowing the magnitude and standard error of an empirical estimate is much more important than simply knowing the sign of the estimate and whether it is statistically significant. Yet we find that even in the most prestigious journals, when empirical social science researchers choose their primary findings – the findings they present in abstracts – the vast majority ignore this teaching and report neither the magnitude nor the precision of their discoveries. They provide no numerical results for 63% ± 3% of empirical economics articles and for a whopping 92% ± 1% of empirical political science or sociology articles between 1999 and 2019. Moreover, they almost never report any numerical measure of precision (0.1% ± 0.1%) in key results. Many social scientists seem committed to a culture of null hypothesis testing rather than a culture of estimation. There is another way: medical researchers routinely report numerical magnitudes (98% ± 1%) and precision (83% ± 2%) in key results. Trends suggest that economists, but not political scientists or sociologists, are turning to numerical reporting: the share of empirical economics articles with numerical results in titles has doubled since 1999, and economics articles with numerical results in titles obtain more citations (+19% ± 11%).
Via someone on Twitter?