EARMA Conference Odense 2024

On fair and consistent evaluations of P2 proposals

LERU recommendations to the EC for fair and consistent evaluation of Horizon Europe collaborative proposals. What RMAs should know about evaluation procedures and instructions.

Conference: EARMA Conference Odense 2024

Format: Oral 30 Minutes

Topic: Operations and Planning

Abstract

The quality of the evaluation procedure and Evaluation Summary Reports (ESRs) of Horizon Europe collaborative proposals should match the high quality expected of the R&I funded by the Framework Programme. To assess whether this is the case and to suggest possible improvements, a group of LERU universities set up a data-based project to monitor the quality and uniformity of ESRs under Horizon Europe’s Pillar 2. On the basis of our results, we present a first set of recommendations for the European Commission (EC) to (1) align EC expectations for Pillar 2 research as closely as possible with the instructions for prospective applicants and reviewers and (2) ensure uniformity of ESRs across calls and clusters.
The LERU ESR project collected and analysed a set of ESRs (N = 131) from collaborative proposals under the Horizon Europe 2021 Work Programme calls. Our presentation will focus on the data analysis underpinning the EC recommendations, which are highly relevant for RMAs who train and assist researchers in proposal writing.
Firstly, a close reading of the evaluation criteria, template guidance and evaluators’ guidelines revealed differences between the instructions given to prospective applicants and those given to evaluators. These inconsistencies may confuse applicants and evaluators and are at odds with the principle of uniform and fully transparent evaluation criteria.
Secondly, the data analysis revealed idiosyncratic differences across calls and clusters, conflicting with the expected uniformity of evaluation procedures and criteria. Zooming in on the crosscutting issues specifically, we found mismatches between the EC’s definitions of crosscutting issues, e.g. mandatory and recommended open access principles, and reviewers’ understanding of these concepts. The data also show a weaker negative correlation between the number of negative comments and the excellence score for crosscutting issues than for the sections on overall methodology, state of the art and objectives, which may indicate that crosscutting issues weigh less in the final excellence score. Reviewers’ insufficient understanding of the EC’s definitions of some, especially newer, crosscutting issues must be avoided.
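To make the comparison concrete, the following is a minimal Python sketch of the kind of correlation check described above: counting negative reviewer comments per ESR section and correlating that count with the proposal's excellence score. The section names, comment counts and scores are illustrative placeholders, not the LERU project's actual data or code.

```python
# Hypothetical sketch of a per-section correlation check.
# All numbers below are toy placeholders for illustration only.
import numpy as np

# Excellence scores for a set of proposals (0-5 scale).
excellence_score = np.array([4.5, 4.0, 3.5, 3.0, 2.5, 4.8, 3.8, 2.0])

# Negative-comment counts per ESR section for the same proposals.
neg_comments = {
    "methodology":         np.array([0, 1, 2, 3, 4, 0, 1, 5]),
    "crosscutting_issues": np.array([1, 0, 2, 1, 2, 0, 2, 2]),
}

for section, counts in neg_comments.items():
    # Pearson correlation between comment count and excellence score;
    # a coefficient closer to 0 would suggest the section weighs less
    # in the final score than a strongly negative one.
    r = np.corrcoef(counts, excellence_score)[0, 1]
    print(f"{section}: r = {r:+.2f}")
```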
Based on our analysis, we argue for two sets of measures to be taken by the EC. First, the description of the evaluation criteria, the template guidelines and the instructions for reviewers need to be reviewed and aligned as closely as possible, so that proposal writers and reviewers receive consistent and identical instructions. The instructions on some parts of the proposal, notably those relating to some of the newer crosscutting issues, also need further optimization. Secondly, we urge the EC to continue to invest in well-trained and experienced officers who monitor the consensus talks and the writing of ESRs, in order to ensure the correct and uniform application of the evaluation criteria across clusters and calls.
Our analysis will make both new(er) and more experienced RMAs aware of avoidable pitfalls in the proposal preparation phase. It also aims to inform RMAs about the evaluation procedures and the instructions given to proposal evaluators. Furthermore, our findings will help RMAs decide how much time to invest in the different crosscutting issues in their training and in the proposal writing process.