Rankings
How we count and rank
Counting Method
Each unique published paper is counted once per organization. If a paper has multiple authors from the same institution, it still counts as one paper for that institution. Organizations are ranked by their total distinct paper count. Ties are broken alphabetically by organization name.
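The counting and tie-breaking rules above can be sketched as follows. This is a minimal illustration, assuming papers arrive as `(paper_id, affiliated-organization list)` pairs; the real pipeline's data shape is not specified here.

```python
from collections import defaultdict

def rank_organizations(papers):
    """Rank organizations by distinct paper count, ties broken alphabetically.

    `papers` is an iterable of (paper_id, [org_name, ...]) pairs — a
    simplified stand-in for the pipeline's actual records.
    """
    counts = defaultdict(set)
    for paper_id, orgs in papers:
        for org in orgs:
            counts[org].add(paper_id)  # set membership: once per org per paper
    # Sort by descending distinct-paper count, then alphabetically for ties.
    return sorted(
        ((org, len(ids)) for org, ids in counts.items()),
        key=lambda item: (-item[1], item[0]),
    )

papers = [
    ("p1", ["MIT", "Stanford"]),
    ("p2", ["MIT", "MIT"]),  # two authors from MIT -> still one paper for MIT
    ("p3", ["Stanford"]),
]
```

With this data, MIT and Stanford each have two distinct papers, so the alphabetical tie-break places MIT first.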
Multiple Affiliations
Authors often list more than one affiliation on a paper — for example, a primary department and a research center, or joint appointments across institutions. Each affiliation is parsed independently, and every distinct organization mentioned gets one paper added to its count. For example, “Booth School of Business, University of Chicago” adds one paper to both the Booth School (for AACSB business school rankings) and the University of Chicago (for university rankings).
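As a toy illustration of crediting every organization named in one affiliation string — the comma-splitting and the hard-coded known-organization set are simplifications, not the pipeline's actual parser:

```python
def orgs_from_affiliation(affiliation):
    """Return every known organization mentioned in an affiliation string.

    Toy version: treat each comma-separated segment as a candidate and keep
    the ones that match a (hypothetical) known-organization list.
    """
    known = {"Booth School of Business", "University of Chicago"}
    return [seg.strip() for seg in affiliation.split(",") if seg.strip() in known]
```

So `orgs_from_affiliation("Booth School of Business, University of Chicago")` credits both organizations with one paper each.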
University Rankings
Universities are ranked by the total number of distinct papers with at least one author affiliated with that institution. Each author's affiliation is matched to an organization record using name lookup and the ROR API.
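One way the ROR step can work is via ROR's public affiliation-matching endpoint, which scores candidate organizations and flags its best guess as `chosen`. The sketch below is an assumption about how such a lookup might be wired up, not the pipeline's actual implementation; the local name lookup that precedes the API call is omitted.

```python
import json
import urllib.parse
import urllib.request

ROR_API = "https://api.ror.org/organizations"

def ror_affiliation_url(affiliation):
    """Build a query URL for ROR's affiliation-matching endpoint."""
    return f"{ROR_API}?affiliation={urllib.parse.quote(affiliation)}"

def match_affiliation(affiliation):
    """Return the organization record ROR marks as 'chosen', or None.

    A sketch of one possible lookup; error handling and caching omitted.
    """
    with urllib.request.urlopen(ror_affiliation_url(affiliation)) as resp:
        results = json.load(resp)
    for item in results.get("items", []):
        if item.get("chosen"):
            return item["organization"]
    return None
```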
Business School Rankings
AACSB-accredited business schools are ranked by the number of distinct papers published by their affiliated authors. Matching is done against the AACSB school name found in each author's affiliation string (e.g. "Wharton School", "Booth School of Business").
- Each paper counts once per business school, even if multiple authors are from the same school.
- The parent university is shown alongside the school name for context.
- Only affiliations matched to AACSB-accredited schools are included in these rankings.
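The school-name matching described above might look like the following sketch. The school list and the substring test are simplifications for illustration; the real matcher and the full AACSB roster are not reproduced here.

```python
# Hypothetical excerpt of an AACSB roster: school name -> parent university.
AACSB_SCHOOLS = {
    "Wharton School": "University of Pennsylvania",
    "Booth School of Business": "University of Chicago",
}

def match_business_school(affiliation):
    """Return (school, parent university) if an AACSB school name appears
    in the affiliation string, else None. Substring matching is a
    simplification of the real matcher."""
    for school, university in AACSB_SCHOOLS.items():
        if school in affiliation:
            return school, university
    return None
```

Affiliations that match no AACSB school simply do not contribute to the business school rankings.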
Author Rankings
Authors are ranked by their total number of distinct papers. When the same researcher appears under different name variations or across multiple institutions, our deduplication pipeline merges these records into a single canonical author using ORCID identifiers, email addresses, name similarity, organizational overlap, and co-author network signals.
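A minimal sketch of identifier-based merging, using only shared ORCIDs and email addresses; the real pipeline also weighs name similarity, organizational overlap, and co-author network signals. The record shape (dicts with `name`, optional `orcid` and `email`) is an assumption for illustration.

```python
from collections import defaultdict

def merge_author_records(records):
    """Group author records sharing an ORCID or email into one cluster."""
    # Union-find over record indices, keyed by shared identifier values.
    parent = list(range(len(records)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    seen = {}  # identifier value -> first record index that carried it
    for i, rec in enumerate(records):
        for key in ("orcid", "email"):
            value = rec.get(key)
            if value:
                if value in seen:
                    union(i, seen[value])
                else:
                    seen[value] = i

    clusters = defaultdict(list)
    for i in range(len(records)):
        clusters[find(i)].append(records[i]["name"])
    return sorted(clusters.values())

records = [
    {"name": "J. Smith", "orcid": "0000-0001"},
    {"name": "Jane Smith", "orcid": "0000-0001", "email": "js@x.edu"},
    {"name": "J. A. Smith", "email": "js@x.edu"},
    {"name": "Bob Lee", "orcid": "0000-0002"},
]
```

Here the first three records chain together (shared ORCID, then shared email) into one canonical author, while Bob Lee stays separate.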
Data Quality
Rankings are only as good as the underlying data. Our pipeline crawls article metadata from publisher websites and cross-references it with CrossRef, ORCID, and ROR APIs, but errors can still occur at any stage — affiliations may be parsed incorrectly, authors may not be properly deduplicated, or organization names may be matched to the wrong institution.
We continuously work to improve data quality, but with thousands of articles across many journals, some issues inevitably slip through.
If you spot an error, please use the “Report Issue” button on any article, author, or ranking page to let us know. Community reports help us identify and fix data issues faster, improving the accuracy of rankings for everyone.