Everyone knows that there's error in all statistical estimates, but we don't always think through the implications of that error. Gelman makes an important point about ratios of (among other things) regression coefficients, with implications for instrumental variables (and a cringe-inducing published example). He notes that the ratio of two normal-like variables is Cauchy-like. My recollection from Don Rubin's causal inference class is that he criticized instrumental variables on exactly these grounds: Cauchy distributions have infinite variance. That fills in why Gelman notes that, in theory, ratios of regression coefficients can take on absurdly large and useless values like 100,000.
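A small simulation makes the point concrete. The ratio of two independent standard normals is exactly standard Cauchy; more generally, when the denominator estimate's mean is small relative to its standard error, the ratio occasionally blows up. The parameters below are illustrative, not taken from Gelman's post or any published example:

```python
import random

random.seed(42)

# Ratio of two independent normal "estimates": the numerator is
# centered at 1, the denominator at 0.1 with standard error 1,
# so the denominator is frequently near zero and the ratio has
# Cauchy-like heavy tails.
n = 100_000
ratios = [random.gauss(1.0, 1.0) / random.gauss(0.1, 1.0)
          for _ in range(n)]

max_abs = max(abs(r) for r in ratios)
print(f"largest |ratio| in {n:,} draws: {max_abs:,.0f}")
```

With this many draws, some denominator lands close enough to zero that the largest |ratio| is typically in the tens of thousands or more, which is the "values like 100,000" phenomenon: the sample mean and variance of such ratios never settle down.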
The published example compares regression coefficients, which is something people do all the time in conversation without even thinking about it.
Gelman says that the post was really time-consuming to write, so perhaps it counts as a few posts; I agree that it should. In fact, a good project would be to review papers in top journals from the past five years and document how often this error is made.