Published July 2023 | Journal Article | Open Access

Cite-seeing and reviewing: A study on citation bias in peer review

Abstract

Citations play an important role in researchers' careers as a key factor in the evaluation of scientific impact. Many anecdotes advise authors to exploit this fact and cite prospective reviewers in an attempt to obtain a more positive evaluation of their submission. In this work, we investigate whether such a citation bias actually exists: does the citation of a reviewer's own work in a submission cause them to be positively biased towards the submission? In conjunction with the review process of two flagship conferences in machine learning and algorithmic economics, we execute an observational study to test for citation bias in peer review. In our analysis, we carefully account for various confounding factors such as paper quality and reviewer expertise, and apply different modeling techniques to alleviate concerns regarding model mismatch. Overall, our analysis involves 1,314 papers and 1,717 reviewers and detects citation bias in both venues we consider. In terms of effect size, by citing a reviewer's work, a submission has a non-trivial chance of getting a higher score from that reviewer: the expected increase in the score is approximately 0.23 on a 5-point Likert item. For reference, a one-point increase in the score from a single reviewer improves the position of a submission by 11% on average.
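Since the underlying data and code are not released (see Data Availability below), the following is only an illustrative sketch of how one could test for such a bias with a within-paper comparison; the column names (paper_id, cited, score), the permutation test, and all other details are assumptions of this sketch, not the authors' actual procedure, which additionally models reviewer expertise and other confounders.

```python
# Illustrative sketch only -- NOT the authors' analysis code (which is
# unreleased). Assumes a hypothetical table with one row per review:
#   paper_id : submission identifier
#   cited    : 1 if the submission cites the reviewer's own work, else 0
#   score    : overall score on a 5-point Likert item
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def within_paper_gap(df: pd.DataFrame) -> float:
    """Mean score gap (cited minus non-cited reviewers), averaged over
    papers with at least one reviewer of each kind. Comparing reviewers
    of the same paper controls for paper quality, the main confounder."""
    gaps = []
    for _, grp in df.groupby("paper_id"):
        cited = grp.loc[grp["cited"] == 1, "score"]
        uncited = grp.loc[grp["cited"] == 0, "score"]
        if len(cited) and len(uncited):
            gaps.append(cited.mean() - uncited.mean())
    return float(np.mean(gaps)) if gaps else 0.0

def permutation_pvalue(df: pd.DataFrame, n_perm: int = 10_000) -> float:
    """P-value for the null of no citation bias: shuffle the 'cited'
    labels within each paper, which preserves paper quality and each
    paper's number of cited reviewers."""
    observed = within_paper_gap(df)
    hits = 0
    for _ in range(n_perm):
        shuffled = df.copy()
        shuffled["cited"] = df.groupby("paper_id")["cited"].transform(
            lambda s: rng.permutation(s.to_numpy())
        )
        if within_paper_gap(shuffled) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

Shuffling the labels within each paper, rather than across the whole table, is what makes this a controlled comparison: any gap that survives the shuffle cannot be explained by some papers simply being better than others.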

Copyright and License

© 2023 Stelmakh et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Acknowledgement

We appreciate the efforts of all reviewers involved in the review process of ICML 2020 and EC 2021. We thank Valerie Ventura for useful comments on the design of our analysis procedure.

Funding

NSF CAREER award 1942124 was awarded to Nihar Shah (https://www.nsf.gov/awardsearch/showAward?AWD_ID=1942124&HistoricalAwards=false). A J.P. Morgan AI research fellowship was awarded to Charvi Rastogi (https://www.jpmorgan.com/technology/artificial-intelligence/research-awards/phd-fellowship-2021). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. There was no additional external funding received for this study.

Data Availability

We note that releasing the experimental data would compromise the reviewers' confidentiality. Thus, following prior work that empirically analyzes the conference peer-review process (Tomkins et al., 2017; Shah et al., 2018; Lawrence and Cortes, 2014), and complying with the conferences' policies, we are unable to release the data and code from the experiment.


Conflict of Interest

The authors have declared that no competing interests exist.

Files

pone.0283980.pdf

Files (2.0 MB total):
843.5 kB (md5:6c68db2775305f0d7630427295953453)
1.1 MB (md5:eebc3c8a7d9f0c4633e4f7fac9504d8d)

Additional details

Created: November 10, 2023
Modified: November 10, 2023