Ezra Klein thinks he understands the real reasons Nicholas Kristof is frustrated with academics. It’s not that we write badly. That’s actually good, at least for journalists’ livelihoods: Since academics “write in jargon but speak in English,” journalists can “arbitrage,” translating academics’ work for the public. So, Klein is not focused on helping academics reach the public themselves, as I was in my reaction to Kristof.
Klein believes journalists’ real problems with academics are that academic journals are “wildly expensive” and there is no “academic equivalent of a best-seller’s list,” making it hard to find interesting papers.
I am skeptical. I think Klein vastly underestimates the distance that most academic writing and most academics would have to travel to be relevant and understandable to journalists. His suggestions, while laudable, won’t bridge much of that gap.
My skepticism comes from my own experiences in academia, particularly too many hours scrutinizing endless equations or convoluted writing trying to tell if an academic paper in my own field was convincing or useful. And I fear that many academics cannot extract from our work what matters to the public and explain it clearly. In fairness, much of what we academics do is intrinsically complex.
I suspect that Klein is optimistic about journalists and academics’ papers because he unconsciously envisions most academics as being like the ones he hangs out with. They explain well and have their fingers on the pulse of what matters to the public. And there is another problem with Klein’s optimism: quite frankly, most journalists are not going to “get” complex analytical material as well as he does.
To illustrate why I think Klein’s proposed changes would have limited impact, here’s a thought experiment. Imagine Klein (or another journalist) coming into my world, where I can identify the best-sellers and my university pays the expensive subscriptions. Conveniently for this exercise, health care policy is something both Klein and I work on.
Our first step: find the “best-selling” academic papers. Klein is right that for many working papers, this is by word of mouth. But for published papers, it’s fairly easy. Journals vary by topic and prestige. One place to identify the most prestigious journals in a particular area is eigenfactor.org. There we find the prestigious Journal of Health Economics (JHE), an expensive Elsevier journal, and therefore a target of the boycott Klein describes.
JHE’s home page is available for free, and it lists “best-sellers,” the most downloaded papers. (Journalists could also check how many times a paper has been cited on Google Scholar.) The abstract of JHE’s most downloaded paper is also available for free, although for those without subscriptions, the paper itself costs $40!
Here’s the abstract: We examine the implications of policies to improve information about the qualities of profit-seeking duopoly hospitals which face the same regulated price and compete on quality. We show that if hospital costs of quality are similar then better information increases the quality of both hospitals. However, if the costs are sufficiently different improved information will reduce the quality of both hospitals. Moreover, even when quality increases, better information may increase or decrease patient welfare depending on whether an ex post or ex ante view of welfare is taken.
Will the imaginary journalist be interested in the “best-selling” article? My guess: Klein or journalists regularly covering health care policy would be interested. They know there are people who want an answer to “Does competition between hospitals improve their quality?”
Our next step is to look at the paper itself, making use of the coveted free access. What do we see? 37 equations, not counting the equation-dense appendix, some of them 4 lines long, and 8 highly involved diagrams.
Would this scare off journalists, even the really wonky analytical ones?
Ideally, journalists could ignore the technical stuff and focus on the clear, relevant introduction and conclusion. In fact, the introduction clearly explains context and importance. And there is a policy conclusion that governments should give hospitals equal access to “capital and labour markets for management and doctors.”
But the paper’s real problem for journalists is that they gain no insight into how the theory works, into why the conclusions and policy recommendations might be true. So, they can’t tell if it’s credible or figure out how to assess quality competition in a particular real world context.
Don’t get me wrong: I am not saying journalists can’t understand what matters in this subject, that only someone who reads equations can do that. The authors should have provided what the journalists need in words. That would help lots of us. But it’s a lot of work to do that and presently, in many journals, authors aren’t required to.
By now you are protesting that Klein wasn’t suggesting that he or other journalists read the paper themselves. The plan is to interview the authors. He expects them to explain “in English” and “donate absurd amounts of time” to him. Since he’s prominent, they probably would give him a lot of time. Not all journalists will get the same treatment.
And will the explanations be “in English,” in everyday English? Will the authors provide intuition for why quality competition between hospitals with different costs lowers quality? Will they explain their assumptions clearly enough so that Klein and his readers can figure out the situations this applies in?
They might. Because the authors are health economists, they likely practice doing that, more than most economists or political scientists. But I am not so sure in this case. The section titled “Intuition and policy relevance” is filled with sentences like, “An increase in information, n, is equivalent to a mean preserving contraction in…” At least when the paper was published, the authors had not done the hard work to nail down and explain key insights. They can’t help journalists until they’ve done that.
Some readers may be protesting that Klein would not be interested in a theory paper and that an empirical one would be easier to understand. Both could be true. But empirical papers can also be tough going. Moreover, journalists also need theories to help them make sense of complicated stuff like health care markets.
Klein makes a lot of good points, about the expense of academic journals, the long time to publication and the “samizdat” status of working papers. But by emphasizing these points, he downplays others, which I think are more important.
The biggest problems are the economic and career realities that journalists and academics face, several described in Joshua Rothman’s response to Kristof. Academics do not get hired and promoted for explaining something that has already been discovered. We get hired and promoted for discovering new things.
Meanwhile most journalists are, in Rothman’s words, “moving in a populist direction” and under pressure “to generate traffic.” They are still being tossed about by waves that decimated journalism jobs.
Journalists and academics are, in Klein’s words, “perfectly designed, in strengths and weaknesses, to support each other.” But together we still leave big gaps in what the public needs: “Important, intrinsically complex stuff explained as accessibly as possible.” I know that Klein, Pierre Omidyar, and others are trying to fix this with their new ventures. I hope they succeed.
But they haven’t even started yet. In the meantime, I hope that academics and academia also try to fill the public’s information needs. For that we need to do more than make our most popular papers available to journalists.
(Paragraphs 3 and 4 were edited on March 15, 2014.)