Grant-writing season is finally over for Australian academics. Actually, grant-writing season is never over, but with the deadlines now past for ARC Discovery grants, Discovery Early Career Researcher Awards and NH&MRC Project Grants, most academics move from compliance-checking and form-filling to thinking about ideas again.
And maybe doing a bit of teaching and research.
Which is why many researchers would have been intrigued by a piece of correspondence in last week’s Nature, with the no-nonsense title “Australia’s grant system wastes time”. Well, everybody who has ever applied for funding and been rejected knows that. But the staggering part of the story is just how much time it wastes.
Danielle L. Herbert, Adrian G. Barnett and Nicholas Graves from Queensland University of Technology analysed the 2012 NH&MRC Project Grant round of 3,727 proposals. By surveying a sample of applicants they estimated that each grant took 38 person days of work, whereas resubmitted grants (ones that narrowly missed in a previous round and were modified and then submitted in 2012) took 28 person days.
In total, they estimated that that single round of funding – admittedly Australia’s biggest funding round – took up the same amount of time as a single person working for 550 years. And given that 80% of applications were unsuccessful, they estimated that over four centuries of effort went unrewarded. This monumental wasted effort, they argued, should be considered by the funding agencies when designing their schemes:
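The headline numbers are easy to sanity-check. A rough back-of-envelope calculation – treating every proposal as a new submission at 38 working days, and assuming roughly 260 working days in a person-year – lands close to the authors' figure:

```python
# Back-of-envelope check of the Herbert, Barnett and Graves estimate.
# Assumptions: all 3,727 proposals treated as new submissions (38 working
# days each) and ~260 working days per person-year. The true figure mixes
# in resubmissions at 28 days, so this slightly overstates the total.

proposals = 3727
days_per_proposal = 38
working_days_per_year = 260

total_days = proposals * days_per_proposal          # 141,626 person-days
person_years = total_days / working_days_per_year   # ~545 person-years

wasted_years = 0.80 * person_years                  # 80% of proposals failed
print(f"{person_years:.0f} person-years, ~{wasted_years:.0f} unrewarded")
```

That puts the total effort in the neighbourhood of 550 person-years, with well over four centuries of it going to unsuccessful applications – consistent with the correspondence's claim.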
If these [proposals] were more focused, it would reduce preparation costs and could improve the quality of peer review by reducing workloads.
Not all rejections are wasted effort
The idea of over four centuries of wasted effort makes a good headline. And yet not all of the effort involved in writing an ultimately-rejected grant is wasted. You do, as they say, have to be in it to win it. And often the process of writing the grant can lead a researcher to new ideas and research directions that can be explored independent of the funds being applied for.
Seven years ago I applied for a very large grant for which I was decidedly under-qualified. The referees identified that my CV fell short of the lofty standards of the scheme and my idea was somewhat undercooked. But in imagining and planning what I would do with such a large and prestigious grant, I arrived at ambitious and exciting research questions around which to structure my research program. It was the most productive rejection I ever experienced.
And yet it is hard to argue against the fact that a lot of researcher time and effort, not to mention morale, gets wasted every year in granting rounds. Add to the four centuries of wasted time in the 2012 NH&MRC Project round the time spent fruitlessly applying for other schemes, including those of the ARC, and we probably waste over a millennium’s working time every year.
And that is just the applicants and those helping them. Add to that the time spent reviewing the applications.
Costs of applying for funding
Governments spend considerable sums each year supporting a variety of research schemes. And public money should always go to the most deserving candidates and the most interesting, important projects. In order to weigh which projects are most deserving, interesting and important, funding agencies rely on comprehensive applications addressing tightly-specified criteria.
They also rely on the voluntary or nominally-remunerated efforts of expert peer-reviewers who assess the quality of proposals, and panellists who weigh these peer-reviews and make the unenviable decisions about which grants to fund.
I spend about one working week each year assessing grant applications, mostly from the ARC but also from overseas agencies. In my experience about 75% of applications deserve to be funded. The researchers have excellent track records in relation to the opportunities they have had, and the proposed projects involve interesting world-class science with every chance of succeeding.
Informal discussions with former panellists suggest a similar picture. About 10% of applications are exceptional, leaving no doubt in anybody’s mind that they should be funded. But the next 65% or so of applications are all fundable, and the decision about whether each gets funded comes down to tiny differences, including near-negligible differences between the referees’ scores. If your application falls within this middle two-thirds, a degree of luck can determine the difference between funding and rejection.
Many researchers find themselves on the wrong end of this luck year after year. These are the people whose careers are being eaten away fruitlessly competing for a small piece of a funding pool that shrinks steadily in real terms.
Could the process be streamlined?
We can argue all we like about how much public money the government should be spending, or what kinds of research it should be funding. But surely everybody with an interest in research funding should embrace greater efficiency in the application and assessment process?
What if we could slice a century out of the “time wasted” column without reducing the quality of the funding scheme or the research it supports? Here are a few thoughts about the kinds of things that designers of granting programs seem to gravitate toward and that researchers and peer-reviewers, in my limited experience, equally consider a monumental waste of their time.
Artificial projections of the impact of the research.
Sometimes an application seeks funding for research that will have a particular applied purpose. That is wonderful, and it forms a strong part of the rationale for the research. But some agencies require fundamental or strategic basic research applications to project exactly how the research will have “impact” – despite the fact that the applicants haven’t yet done the research.
The “Pathways to Impact” statement required by the various Research Councils UK schemes suffers from this problem. As you might expect, these sections get inflated with managerialist newspeak.
As an overseas referee asked to comment on the scientific merits of grant applications and applicants, I grow incensed at the waste of my time and the applicant’s time making fictionalised assurances about the likely ways the just-conceived project will change society and strengthen the economy. In my limited experience of UK schemes, this section has nothing to do with the quality or likely success of the research itself.
Institutional Commitment
In ARC schemes, applicants are now required to work with their host institution to prepare a statement about the research environment or the commitment of the institution. Remember that only certain organisations are eligible to host ARC grants and fellowships. And yet applications require a 2-page outline, signed by the Deputy Vice-Chancellor (Research) of such eligible institutions, concerning (in this case from the DECRA guidelines):
- the fit between the application and the existing and/or emerging research strengths of the administering organisation,
- the support the applicant will get from the organisation and
- the opportunities the fellow will have to become an independent researcher who is “competitive for research and/or research and teaching pathways at the Administering Organisation” during and after the project.
I have read about a hundred of these statements for various schemes. They universally tend toward bland corporate verbiage about the institution’s “Strategic Plan” and “Research Strengths”. Amazingly every project falls into a current or emerging strength. Likewise every institution promises to back the researcher as though they were already short-listed for a Nobel prize. And they promise to turn out a well-rounded and highly competitive researcher after the three-year period.
DECRA Fellowships constitute some of the toughest money to win – worldwide. The best of the best PhD graduates compete with their burnished CVs and scintillating ideas. To suggest that somebody could win one without already being highly competitive for a research career is patronising in the extreme.
Likewise, asking institutions to discuss how committed they are to a Discovery or NH&MRC grant is like asking “Do you want your research to be funded?”. There is only one correct answer.
And yet sometimes as much as 10% of the mark used to score a proposal is based on this section. A section that reveals little about the quality of the applicant or the project. A reviewer wishing to give a very good mark might not know whether to score this section an 8 or a 10. And yet the two-point difference can determine whether an application ends up in the funded 20% or well outside the fundable range.
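To see how much that matters, here is a hypothetical illustration (the weights and scores are mine, not the ARC's actual scoring scheme). If the institutional statement carries 10% of the total mark and the rest of a strong proposal scores 9.0 out of 10, a reviewer's choice between an 8 and a 10 on this one section shifts the total by 0.2 points:

```python
# Hypothetical illustration: assumed weights, not the ARC's real scheme.
# A 10%-weighted section scored 8 vs 10, with the remaining 90% fixed
# at 9.0/10, moves the weighted total by 0.2 on a 10-point scale.

def total_score(section_score, section_weight=0.10, rest_score=9.0):
    """Weighted total on a 10-point scale."""
    return section_weight * section_score + (1 - section_weight) * rest_score

generous = total_score(10)  # reviewer scores the section a 10 -> 9.1 total
cautious = total_score(8)   # reviewer scores the section an 8 -> 8.9 total
print(f"swing: {generous - cautious:.1f} points")
```

When two-thirds of proposals are bunched within tiny margins of one another, a 0.2-point swing on a section that says little about the science is easily enough to push an application across the funding cutoff in either direction.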
I’ll probably make myself very unpopular with research administrators for saying this, but if I were in charge of funding schemes, I’d definitely get rid of this section. The time it takes to write, check and assess it cannot be justified.
How would you streamline the granting process?
I’d love to hear from researchers and research administrators, either in the comments below or via Twitter (@Brooks_Rob), how best grant application and assessment processes could be streamlined without compromising the quality of the applications funded. Perhaps we can save a few centuries’ effort.
Rob Brooks does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.
This article was originally published at The Conversation.