
A Primer on BRIMR

Understanding the Rankings of NIH Support from the Blue Ridge Institute for Medical Research
Published: December 23, 2021. DOI: https://doi.org/10.1016/j.ajpath.2021.12.004
      Every year since 2006, the Blue Ridge Institute for Medical Research (BRIMR), an independent nonprofit organization, has published rankings of institutions, departments, and investigators based on the funding they receive from the US National Institutes of Health (NIH). The BRIMR standings are widely cited as a measure of scientific vitality, and have been used to examine trends in funding to individual grantees and to academic specialties.
      Though focused on one prosaic, pecuniary aspect of the nation's biomedical research enterprise, they have the advantage of being quantitative and objective, insofar as the competition for NIH funds is overseen by expert peer reviewers from institutions across the country. In advance of the release of BRIMR's rankings for 2021, it seems timely to highlight key factors underpinning the calculations that may not be universally recognized, and to suggest practical steps grantees can take to ensure that their accomplishments are appropriately recorded.
      BRIMR's methodology follows that of the NIH, which for many years published its own rankings, but stopped doing so after 2005. The standings reflect only NIH funds awarded during a given federal fiscal year, which ends on September 30. They are based on data compiled and published by the NIH in a master spreadsheet called “Worldwide” that is released on the NIH Research Portfolio Online Reporting Tool (RePORT) each year, typically in late December. Serving as a year-end synopsis of NIH's extramural awards, the Worldwide file lists such metadata as the grantee institution, school, and department; the principal investigator (PI) or program director; the project title and identification number; and the dollar amount of funding for each award. The most recent file, for fiscal year 2020, detailed nearly $35 billion awarded through 62,585 grants or contracts to more than 41,000 PIs in the United States and abroad. Of that total, $22 billion (62%) was awarded to 516 professional schools and hospitals that were included in BRIMR's calculations (www.brimr.org/NIH_Awards/2020, last accessed December 10, 2021). NIH neither participates in nor endorses the rankings, which are solely BRIMR's responsibility; calling them “NIH rankings” can be misleading in that regard.
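As a rough illustration of the kind of aggregation the Worldwide file supports, the sketch below totals funding per grantee organization from a CSV export. The column names (`ORGANIZATION NAME`, `FUNDING`) are hypothetical stand-ins; the actual Worldwide spreadsheet's headers and formats should be checked against the file itself on RePORT.

```python
import csv
from collections import defaultdict

def summarize_worldwide(path):
    """Total funding per grantee organization from a Worldwide-style CSV.

    Column names here are assumptions for illustration; the real file's
    headers may differ and should be verified before use."""
    totals = defaultdict(float)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            org = row["ORGANIZATION NAME"].strip()
            # Dollar fields are sometimes blank or comma-formatted.
            amount = row["FUNDING"].replace(",", "").strip()
            totals[org] += float(amount) if amount else 0.0
    return dict(totals)
```

Summing such per-award rows by organization, school, or department is essentially the first step behind each of BRIMR's ranking tables.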
      For each award in the Worldwide dataset, NIH makes two key determinations that guide BRIMR's subsequent analyses. First, NIH assigns the grantee organization to one of 33 thematic categories, which it calls “NIH MC Combining Names”; BRIMR relies on these assignments when it compares organizations that fall within seven of those categories (ie, hospitals, or schools of dentistry, medicine, nursing, pharmacy, public health, or veterinary medicine) while disregarding all others (eg, graduate schools). Second, faced with a profusion of department names and missions, NIH assigns each funded department to one of 46 “NIH Department Combining Names,” most of which correspond to traditional basic-science or clinical disciplines such as biochemistry or dermatology (Table 1). BRIMR relies on the latter taxonomy when it ranks medical-school departments within 27 of the 46 categories. Medical-school PIs, in turn, are ranked by comparison to all others who hold awards under the same department combining name.
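The two-step logic described above (group awards by their NIH-assigned combining name, then rank within each category by total dollars) can be sketched as follows. The tuple layout is invented for illustration; BRIMR's actual processing operates on the full Worldwide spreadsheet.

```python
from collections import defaultdict

def rank_departments(awards):
    """Rank departments within each NIH Department Combining Name.

    `awards` is a list of (combining_name, department, dollars) tuples,
    a simplified stand-in for rows of the Worldwide file."""
    by_category = defaultdict(lambda: defaultdict(float))
    for combining_name, dept, dollars in awards:
        by_category[combining_name][dept] += dollars
    rankings = {}
    for combining_name, totals in by_category.items():
        # Order departments by total funding, highest first.
        ordered = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
        rankings[combining_name] = [
            (rank, dept, total)
            for rank, (dept, total) in enumerate(ordered, start=1)
        ]
    return rankings
```

PIs are ranked the same way, compared against all other award holders under the same combining name rather than within their own institution.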
      Table 1 NIH-Designated Categories (Department Combining Names) of Medical-School Departments

      Ranked by BRIMR (27 categories)
      Basic sciences: Anatomy/Cell Biology; Biochemistry; Biomedical Engineering; Genetics; Microbiology/Immun/Virology; Neurosciences; Pharmacology; Physiology
      Clinical: Anesthesiology; Dermatology; Emergency Medicine; Family Medicine; Internal Medicine/Medicine; Neurology; Neurosurgery; Obstetrics & Gynecology; Ophthalmology; Orthopedics; Otolaryngology; Pathology; Pediatrics; Physical Medicine & Rehab; Psychiatry; Public Health & Prev Medicine; Radiation-Diagnostic/Oncology; Surgery; Urology

      Not Ranked by BRIMR (19 categories)
      Basic sciences: Biology; Biophysics; Biostatistics & Other Math Sci; Chemistry; Engineering (All Types); Nutrition; Physics; Psychology; Social Sciences; Zoology
      Clinical: Dentistry; Plastic Surgery; Veterinary Sciences
      Other: Administration; Miscellaneous; None; Other Basic Sciences; Other Clinical Sciences; Other Health Professions

      The NIH uses 46 standard “Department Combining Names” to categorize medical-school departments, their principal investigators, and their awards. BRIMR ranks departments within 27 of these NIH-assigned categories (Source: report.nih.gov/system/files/inline-files/dept_name.xls, last accessed December 10, 2021).
      Using the Worldwide file as our main data source preserves BRIMR's impartiality and the concordance of the results with federal records, but it imposes constraints as well. Most consequentially, NIH credits all funding for each award to a single lead institution, department, and PI, even in the case of large multi-institutional consortia, program project grants, and multi-PI awards. Accordingly, the dollar amounts that NIH lists and BRIMR ranks do not necessarily reflect funds received by a given organization, school, or investigator. It may be best to think of them as signifying the totality of NIH support for individual, collaborative, or multi-institutional projects that are led by the named grantee. The Worldwide file distinguishes nine types of awards, including research project grants, center support, and training awards, which are considered equally in BRIMR's rankings. Administrative supplements to these awards, which are also included, substantially affected the 2020 standings amid the coronavirus pandemic, and are likely to do so again in 2021. Research and development contracts, which totaled about $2.0 billion in 2020, are an exception. They are rarely ascribed to a specific medical school or department in the Worldwide file, so BRIMR excludes them from its rankings and instead lists them in a separate table.
      The varied administrative structures and practices of grantee organizations also can affect the categories under which awards are counted. Universities differ, for example, in whether disciplines like family medicine or neuroscience are departments of their own or subsumed into larger ones, and in whether public health is constituted as a medical-school department, an independent school, or both. Departments whose names encompass two rankable disciplines (eg, pathology and cell biology) must be classified as one or the other, and some disciplines (such as molecular medicine, biomedical informatics, or cancer biology) may not readily fall into any available category at all. Institutional awards such as those supporting cancer centers may be credited either to an individual PI and department, or to an administrative entity (such as a Dean's office) that is not ranked. Many medical schools, moreover, have faculty PIs based at independent hospitals or other affiliates that administer, and hence are credited with, those investigators' awards. Some of the nation's most prominent multispecialty teaching hospitals, pediatric centers, and cancer or neuropsychiatric institutes administer NIH grants independently in this way, which necessarily shifts credit away from the medical schools and departments whose faculty practice there. BRIMR respects such variation as reflecting choices made locally by the institutions themselves, and these local choices are rarely changed.
      BRIMR makes many adjustments of its own to the dataset, most of them aimed at facilitating computerized analyses and standardizing terminology. The total funding of an award, however, is never altered. When BRIMR releases its conclusions online, both the NIH source file and BRIMR's final revised dataset are available for comparison, so that all data and revisions underlying the rankings are transparently accessible for public review.
      How can grantees help? Be systematic and minimize variation when naming your organization, its schools, departments, and PIs in grant submissions, so awards will be credited appropriately. Carefully review BRIMR's draft revisions of the Worldwide dataset that are posted on our website for a four-week period in January and February, and notify us promptly of any needed corrections. Be especially attentive to four data elements that determine the standings, ie, the organization name, department combining name, PI name, and total funding. Awards that list a PI's name in alternative forms (eg, with or without a middle initial), in more than one home department, or in a center or institute rather than a department, could risk misattribution. Omissions are surprisingly common. Even after incorporating grantee input, 902 noncontract awards to medical schools in 2020, totaling $567 million, listed no departmental home and could not be included in any of BRIMR's departmental or PI calculations. The reasons for this undoubtedly vary, but they warrant attention by grantee institutions. Searching the NIH database with its RePORT Expenditures and Reports (RePORTER) tool can be helpful for cross-checking. However, unless search parameters are set appropriately, RePORTER may show data from multiple fiscal years, sub-awards, carry-forwards, no-cost extensions, or funding sources other than NIH, none of which will be included in the Worldwide file or used in BRIMR's rankings. The goal is to ensure that all applicable awards are fully and accurately credited. Instructions for submitting corrections are presented on BRIMR's website each year, along with the deadline by which these must be received.
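For cross-checking against RePORTER programmatically, the sketch below builds a query payload restricted to a single fiscal year and to NIH as the funding agency, mirroring the Worldwide file's scope. The field names follow the public RePORTER API v2 as best understood here, but they are assumptions and should be verified against the API's current documentation before use.

```python
import json

def build_reporter_query(org_name, fiscal_year, limit=50):
    """Assemble a RePORTER v2 search payload scoped to one fiscal year.

    Field names are assumptions based on the public RePORTER API and
    should be checked against its documentation."""
    return {
        "criteria": {
            "org_names": [org_name],
            "fiscal_years": [fiscal_year],
            "agencies": ["NIH"],          # exclude non-NIH funding sources
            "exclude_subprojects": True,  # sub-awards are not in Worldwide
        },
        "include_fields": ["ProjectNum", "ContactPiName", "AwardAmount"],
        "limit": limit,
    }

# The payload would be POSTed as JSON to the RePORTER projects-search
# endpoint (api.reporter.nih.gov/v2/projects/search).
```

Constraining the fiscal year and agency in this way avoids the common pitfall noted above, in which RePORTER returns multiple years, sub-awards, or non-NIH funding that the Worldwide file does not include.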
      Scientific merit cannot be measured in grant dollars or expressed in ordinal scores. Nevertheless, BRIMR's rankings continue to feature prominently in academic newsletters, websites, strategic plans, performance reviews, recruitment materials, and hallway conversations, which testifies to their appeal. For those who prioritize discovery, the BRIMR standings offer objective national benchmarks, a shorthand metric for tracking progress, a yearly celebration of NIH's immense scope and impact, or simply a way to keep score. We welcome input from all stakeholders to help keep these rankings as valid, useful, and accurate as possible.

      References

        • Noble P.
        • Ten Eyck P.
        • Roskoski Jr., R.
        • Jackson J.B.
        NIH funding trends to US medical schools from 2009 to 2018.
        PLoS One. 2020; 15: e0233367
        • Chandrakantan A.
        • Adler A.C.
        • Stayer S.
        • Roth S.
        National Institutes of Health-funded anesthesiology research and anesthesiology physician-scientists: trends, promises, and concerns.
        Anesth Analg. 2019; 129: 1761-1766
        • Lonser R.R.
        • Smith L.G.F.
        • Tennekoon M.
        • Rezai-Zadeh K.P.
        • Ojemann J.G.
        • Korn S.J.
        Creation of a comprehensive training and career development approach to increase the number of neurosurgeons supported by National Institutes of Health funding.
        J Neurosurg. 2021; 135: 176-184
        • Lewit R.A.
        • Black C.M.
        • Camp L.
        • Brott N.
        • Cottrell J.M.
        • Herman T.
        • Holden K.W.
        • Matthews L.
        • Schneider E.
        • Goldstein A.M.
        • Matthews J.B.
        • Emamaullee J.
        • Gosain A.
        Association of surgeon representation on NIH study sections with receipt of funding by surgeon-scientists.
        Ann Surg. 2021; 273: 1042-1048
        • Feinstein M.M.
        • Niforatos J.D.
        • Mosteller L.
        • Chelnick D.
        • Raza S.
        • Otteson T.
        Association of doximity ranking and residency program characteristics across 16 specialty training programs.
        J Grad Med Educ. 2019; 11: 580-584
        • Conte M.L.
        • Schnell S.
        • Ettinger A.S.
        • Omary M.B.
        Trends in NIH-supported career development funding: implications for institutions, trainees, and the future research workforce.
        JCI Insight. 2020; 5: e142817
        • Patel P.A.
        • Gopali R.
        • Reddy A.
        • Patel K.K.
        National Institutes of Health funding trends to ophthalmology departments at U.S. medical schools.
        Semin Ophthalmol. 2021; 1: 1-7