“In science … novelty emerges only with difficulty, manifested by resistance.” So wrote Thomas Kuhn in his landmark The Structure of Scientific Revolutions. Some studies have backed up Kuhn’s insight that the scientific establishment tends to be cool toward published ideas that challenge a discipline’s entrenched paradigms. But there have been few examinations of how research funding agencies treat scientists with novel ideas, even though their yea-or-nay decisions largely determine whether unorthodox concepts can take root and grow at all.
Now, a new study finds that at least one funder wasn’t wild about researchers with records of pursuing out-of-the-box ideas: those who applied to Sinergia, a grants program at the Swiss National Science Foundation. Over a 5-year period, scientists with a record of publishing novel papers received lower evaluation scores and were 31% less likely to receive funding than applicants without such a record.
Although limited to a single funding program, the new study supports perceptions that grant reviewers are biased against novelty, says Jian Wang of KU Leuven, who was not involved in the research. He says the findings are consistent with research he co-authored that found a similar coolness toward unconventional ideas in grantmaking by the European Research Council. Reviewers may shy away from novel proposals because they fear the projects won’t produce useful results. One big question, Wang adds, is: “How do we select reviewers or structure the selection process to mitigate bias against novelty?”
Funding agencies ordinarily keep the identities of grant applicants secret. But by pledging not to publish identifying details, the authors of the new study were allowed to examine funding decisions on 255 applications, involving 775 scientists, that were submitted to Sinergia from 2008 to 2012.
Overall, Sinergia funded just under 50% of the applications, the authors report in the October issue of Science and Public Policy. The authors did not rate the novelty of the proposals, calling that a question for future research. Instead, they scored each applicant’s track record of novelty using an existing method that examines unusual combinations of journal titles referenced by the scientist’s published research.
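The intuition behind that kind of metric can be sketched in code. The toy example below is not the authors’ method or data; it is a minimal, hypothetical illustration of the general idea: count how often each pair of journals is cited together across a corpus, then score a paper by the fraction of its journal pairs that are rare. The corpus, journal names, and rarity threshold are all invented for illustration.

```python
from itertools import combinations
from collections import Counter

def pair_counts(corpus):
    """Count how often each unordered pair of journals is co-cited
    across a corpus, where each item is one paper's reference journals."""
    counts = Counter()
    for refs in corpus:
        for pair in combinations(sorted(set(refs)), 2):
            counts[pair] += 1
    return counts

def novelty_score(refs, counts):
    """Fraction of a paper's journal pairs that are rare in the corpus
    (here, co-cited at most once -- an arbitrary toy threshold)."""
    pairs = list(combinations(sorted(set(refs)), 2))
    if not pairs:
        return 0.0
    rare = sum(1 for p in pairs if counts[p] <= 1)
    return rare / len(pairs)

# Invented mini-corpus: lists of journals cited by three papers.
corpus = [
    ["Nature", "Science", "Cell"],          # conventional mix
    ["Nature", "Science", "PNAS"],          # conventional mix
    ["Nature", "J. Econometrics", "Cell"],  # unusual cross-field pairing
]
counts = pair_counts(corpus)
# The cross-field paper scores high: 2 of its 3 journal pairs are rare.
print(round(novelty_score(["Nature", "J. Econometrics", "Cell"], counts), 2))  # → 0.67
```

A real implementation would work from millions of papers and compare observed pair frequencies against a chance baseline, but the core bookkeeping is the same.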
Their analysis indicated the track records carried weight with reviewers: The grant awards revealed a bias against applications submitted by teams in which at least two-thirds of the scientists had high novelty ratings. Other characteristics of the applicant teams, such as whether all members were based in Switzerland or the team was larger than others, did not explain the outcomes—although applicants without other existing grants were also somewhat disfavored, note the study’s three co-authors, economists Charles Ayoubi of Harvard University, Michele Pezzoni of the French national research agency CNRS, and Fabiana Visentin of Maastricht University.
Why did grant reviewers take such a dim view of the potentially innovative proposals? One possibility, the authors suggest, is that the applicants’ previous scholarly findings had attracted few citations. Research has shown that leading citation metrics tend to disadvantage papers containing novel results. Compared with similar but conventional papers, novel ones tend to appear in lower-profile journals and receive fewer citations within 2 years of publication—even though they often prove themselves in the long run, ending up with top citation rankings and inspiring follow-up studies.
For scientists rejected by Sinergia, there may have been a silver lining: Whether or not they won funding, applicants tended to publish more scholarly articles than similar scientists who did not apply, for at least 5 years after submitting their proposals, Ayoubi and his colleagues found in a separate study of the same data set. They speculate that the application process encouraged applicants to forge connections and launch productive collaborations with other scientists.
In 2016, Sinergia moved to encourage novelty by explicitly soliciting proposals for “breakthrough” and interdisciplinary research. Despite the foundation’s strong interest in such proposals, “We admit that it is notoriously difficult to identify novelty and the potential for breakthrough before the research has been carried out,” says Anne Jorstad, data team head at the Swiss foundation. She says it’s too early to judge whether that change in criteria has paid off in a warmer embrace of unorthodox research.