
The public appetite for more information about Covid-19 is understandably insatiable. Social scientists have been quick to respond. They are writing papers at a record pace, and academic journals are expediting the review process so that these new, exciting results can be published in a timely and newsworthy manner. While I understand the impulse, the rush to publish findings quickly in the midst of the crisis does little for the public and harms the discipline of social science.
Even in normal times, social science suffers from a host of pathologies. Results reported in our leading scientific journals are often unreliable because researchers can be careless, may selectively report their results, and face career incentives to publish as many exciting results as possible, regardless of validity.
A global crisis only exacerbates these problems. Rushing to publish timely results means more carelessness, and the promise of favourable news coverage in a time of crisis further distorts incentives.
I am especially concerned about three trends among social scientists during the Covid-19 pandemic, the first of which is that many of them appear to be rushing their work. Good science takes time. Researchers often spend months collecting, organising and double-checking their data. They spend more months presenting their findings and gathering feedback from colleagues before they publicly release their results. But many social scientists are already releasing and publicising studies based on Covid-19 data collected just days ago, often without the level of rigour they would normally apply.
For example, several recent studies have asked whether partisan attitudes affect social distancing. One challenge is that it’s difficult to measure social distancing. In one recent study, survey respondents were asked to self-report their social distancing, but people often misreport their beliefs and behaviours in political surveys. Another study used GPS data to measure visits to places of interest like restaurants and movie theatres, but this seems like a poor test of social distancing at a time when many such places are closed (especially in more Democratic places). A second challenge is that even if we find a clear difference between Democratic and Republican behaviour, it’s difficult to say whether this difference is explained by political attitudes or other factors. Democrats tend to live in more urban places, where the pandemic has been more severe and local governments have implemented more stringent policies and guidelines; neither of these studies accounted for these alternative explanations.
Another recent study investigated the extent to which watching “Hannity” versus “Tucker Carlson Tonight” may have increased the spread of Covid-19. This is the kind of study that would invite scepticism even in normal times. An extra concern now is that the paper was likely written in just a few days. Although the authors write that they used variation in sunset times to estimate the effect of watching “Hannity,” a closer reading suggests that they’re mostly using variation in how much people in different media markets watch television and how much Fox News they watch. Maybe conservative commentators like Sean Hannity have exacerbated the spread of Covid-19, but it’s dangerous for social scientists to publicise these kinds of results before they have been carefully vetted.
Not only are social scientists rushing to write these studies, but academic journals are also rushing to publish them. An editor might typically vet submitted papers, then select experts in the field to review them. The reviewers might read a paper carefully and provide feedback. And then the authors would have the opportunity to revise their paper in response to that feedback, and the process would repeat (often multiple times). But for papers related to Covid-19, the typical process is being streamlined. I was recently asked to review three such papers for a scientific journal. Although an editor might normally give me six weeks to complete one review, I was asked to complete three reviews in just one week. Similarly, a political science journal asked me to referee a paper for a rapid-review series related to Covid-19. The editor explicitly stated that my review would not require the detail or length of a normal review; instead, they wanted a simple “accept” or “reject” within five days.
One possible reason for rushing science in the midst of a crisis is that the benefits of quickly getting new information to the public and to policy makers outweigh the potential costs of giving them less reliable information. Perhaps one could make this argument for those studying how to cure or prevent the spread of Covid-19. But most of the work being done by social scientists on Covid-19, while interesting and important, is not urgent. Understanding how political attitudes affect social distancing may be relevant for understanding political psychology, for example, and it might even help us design better solutions in a future pandemic, but it doesn’t significantly benefit society to have this information today.
The second troubling trend is the temptation of social scientists to speak outside their areas of expertise. There is so much we don’t know about Covid-19 and so much uncertainty about how the pandemic will play out that many are tempted to speculate and conduct their own analyses.
—Bloomberg