Journal ranking

Source: Wikipedia, the free encyclopedia.

Journal ranking is widely used in academic circles in the evaluation of an academic journal's impact and quality. Journal rankings are intended to reflect the place of a journal within its field, the relative difficulty of being published in that journal, and the prestige associated with it. They have been introduced as official research evaluation tools in several countries.

Measures

Traditionally, journal ranking "measures" or evaluations have been provided simply through institutional lists established by academic leaders or through a committee vote. These approaches have been notoriously politicized and are inaccurate reflections of actual prestige and quality, as they often mirror the biases and personal career objectives of those involved in ranking the journals; they also produce highly disparate evaluations across institutions.[1] Consequently, many institutions have required external sources of evaluation of journal quality. The traditional approach here has been through surveys of leading academics in a given field, but this approach too has potential for bias, though not as profound as that seen with institution-generated lists.[2] Governments, institutions, and leaders in scientometric research have therefore turned to a range of observed journal-level bibliometric measures that can serve as surrogates for quality and thus eliminate the need for subjective assessment.[1]

Several journal-level metrics have accordingly been proposed, most of them citation-based:

  • Impact factor and CiteScore – reflecting the average number of citations to articles published in science and social science journals (a formula sketch follows this list).
  • SCImago Journal Rank – a measure of scientific influence of scholarly journals that accounts for both the number of citations received by a journal and the importance or prestige of the journals where such citations come from.
  • h-index – usually used as a measure of scientific productivity and the scientific impact of an individual scientist, but can also be used to rank journals (a computational sketch follows this list).
    • h5-index – this metric, calculated and released by Google Scholar, is based on the h-index of all articles published in a given journal in the last five years.[3]
  • Expert survey – a score reflecting the overall quality or contribution of a journal is based on the results of the survey of active field researchers, practitioners and students (i.e., actual journal contributors or readers), who rank each journal based on specific criteria.[4]
  • Top quartile citation count (TQCC) – reflecting the number of citations accrued by the paper that resides at the top quartile (the 75th percentile) of a journal's articles when sorted by citation counts; for example, when a journal published 100 papers, the 25th most-cited paper's citation count is the TQCC (a sketch follows this list).[5]
  • Publication power approach (PPA) – the ranking position of each journal is based on the actual publishing behavior of leading tenured academics over an extended time period. As such, the journal's ranking position reflects the frequency at which these scholars published their articles in this journal.[6][7]
  • Altmetrics – rate journals based on scholarly references added to academic social media sites.[8]
  • diamScore – a measure of scientific influence of academic journals based on recursive citation weighting and the pairwise comparisons between journals.[9]
  • Source normalized impact per paper (SNIP) – a factor released in 2012 by Elsevier, based on Scopus, to estimate impact.[10] The measure is calculated as SNIP = RIP / (R / M), where RIP is the raw impact per paper, R the citation potential, and M the median database citation potential (a worked toy calculation follows this list).[11]
  • PageRank – in 1976, a recursive impact factor was proposed that gives citations from journals with high impact greater weight than citations from low-impact journals.[12] Such a recursive impact factor resembles Google's PageRank algorithm, though the original paper uses a "trade balance" approach in which journals score highest when they are often cited but rarely cite other journals; several scholars have proposed related approaches (a power-iteration sketch follows this list).[13][14][15][16]
    • Eigenfactor is another PageRank-type measure of journal influence,[17] with rankings freely available online.[18]
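
The classic two-year impact factor, for instance, is simple arithmetic once the citation counts are in hand: citations received in a given year to the journal's articles from the two preceding years, divided by the number of citable items published in those years. A minimal Python sketch with invented numbers:

    def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
        """Two-year impact factor: citations received this year to articles
        published in the previous two years, divided by the number of
        citable items published in those same two years."""
        return citations_to_prev_two_years / citable_items_prev_two_years

    # Invented example: 210 citations in 2023 to articles from 2021-2022,
    # which comprised 120 citable items:
    print(impact_factor(210, 120))  # -> 1.75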
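
The h-index is equally mechanical to compute from a list of per-article citation counts: a journal has index h if h of its articles have at least h citations each. A minimal sketch, with the function name and sample data invented for illustration:

    def h_index(citations):
        """Return the h-index: the largest h such that h papers have
        at least h citations each."""
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, count in enumerate(ranked, start=1):
            if count >= rank:
                h = rank
            else:
                break
        return h

    # A hypothetical journal with seven articles and their citation counts:
    print(h_index([24, 18, 10, 6, 5, 2, 0]))  # -> 5

Google Scholar's h5-index is the same computation restricted to articles published in the last five complete calendar years.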
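
The TQCC follows the same pattern. Percentile conventions vary between implementations; the sketch below takes the "sort descending, read off the paper at rank n/4" interpretation given in the list above, with invented sample data:

    def tqcc(citations):
        """Top quartile citation count: the citation count of the article at
        the 75th percentile when articles are sorted by citations; for 100
        papers this is the 25th most-cited paper, as described above."""
        ranked = sorted(citations, reverse=True)
        index = max(len(ranked) // 4 - 1, 0)  # 25th of 100 -> index 24
        return ranked[index]

    # Hypothetical citation counts for a small journal (8 papers -> 2nd most-cited):
    print(tqcc([40, 22, 15, 9, 7, 4, 3, 1]))  # -> 22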
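
The SNIP formula itself is plain arithmetic; the laborious part is estimating the citation-potential inputs from the Scopus database. A toy calculation with invented values:

    def snip(rip, r, m):
        """SNIP = RIP / (R / M): raw impact per paper, deflated by the ratio
        of the field's citation potential R to the database median M.
        All values used below are invented for illustration."""
        return rip / (r / m)

    # A journal in a densely citing field (R > M) has its raw impact deflated:
    print(snip(rip=4.0, r=2.5, m=1.25))  # -> 2.0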
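
Recursive measures of this family can be approximated by power iteration on a journal citation matrix, as in PageRank. The sketch below is a generic illustration with an invented three-journal matrix; it is not the exact 1976 procedure or the Eigenfactor algorithm, which among other refinements excludes journal self-citations:

    import numpy as np

    def journal_influence(C, damping=0.85, iters=100):
        """Weight citations by the influence of the citing journal via power
        iteration, as in PageRank. C[i][j] is the number of citations from
        journal j to journal i; assumes every journal cites at least one
        other journal (no all-zero columns)."""
        n = C.shape[0]
        M = C / C.sum(axis=0, keepdims=True)  # each citing journal hands out one unit
        w = np.full(n, 1.0 / n)
        for _ in range(iters):
            w = (1 - damping) / n + damping * (M @ w)
        return w / w.sum()

    # Invented 3-journal citation matrix (rows = cited, columns = citing):
    C = np.array([[0., 5., 2.],
                  [3., 0., 4.],
                  [1., 2., 0.]])
    print(journal_influence(C))  # journals cited by influential journals score highest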

Discussion

Negative consequences of rankings are well documented and relate to the performative effects of using journal rankings for performance measurement.[19][20] Studies of methodological quality and reliability have found that "reliability of published research works in several fields may be decreasing with increasing journal rank",[21] contrary to widespread expectations.[22]

For example, McKinnon (2017) analyzed how the ABS-AJG ranking, which despite its methodological shortcomings is widely accepted in British business schools, has had negative consequences for the transportation and logistics management disciplines.[23] A study published in 2021 compared the Impact Factor, Eigenfactor Score, SCImago Journal & Country Rank and the Source Normalized Impact per Paper in journals related to Pharmacy, Toxicology and Biochemistry, and found "a moderate to high and significant correlation" between them.[24]

Thousands of universities and research bodies have issued official statements denouncing the idea that research quality can be measured on the uni-dimensional scale of a journal ranking, most notably by signing the San Francisco Declaration on Research Assessment (DORA), which asked "not [to] use journal-based metrics ... as a surrogate measure of the quality of individual research articles, to assess an individual scientist's contributions, or in hiring, promotion, or funding decisions".[25] The Community for Responsible Research in Business Management (cRRBM) asks whether "even the academy is being served when faculty members are valued for the quantity and placement of their articles, not for the benefit their research can have for the world".[26] Some academic disciplines such as management exhibit a journal ranking lists paradox: on the one hand, researchers are aware of the numerous limitations of ranking lists and their deleterious impact on scientific progress; on the other hand, they generally find journal ranking lists useful and employ them, in particular when their use is not mandated by their institutions.[27]

National rankings

Several national and international rankings of journals exist, e.g.:

  • Australia – the Australian Research Council's ranking of journals[28][29] and the CORE Rankings Portal for computing research[30]
  • Denmark – the authority list of the Uddannelses- og Forskningsministeriet (Ministry of Higher Education and Science)[31]
  • Finland – Julkaisufoorumi (Publication Forum)[32]
  • Norway – the Norwegian Register for Scientific Journals[33]
  • Italy – the ANVUR rating of scientific journals[34]
  • United Kingdom – the Chartered Association of Business Schools' Academic Journal Guide[35]
  • Pakistan – the HEC list of recognized journals[36]
  • India – the NAAS score of science journals[37]
  • Poland – the journal list of the Polish Ministry of Higher Education and Science[38][39]

They have been introduced as official research evaluation tools in several countries.[40]

References

  1. ^ SSRN 2186798. Also, see the YouTube video narrative of this paper.
  2. .
  3. .
  4. .
  5. ^ "About OOIR: Journal-level data". Retrieved 2023-03-14.
  6. .
  7. .
  8. ISBN 978-3-64240-500-6.
  9. .
  10. ^ "Elsevier Announces Enhanced Journal Metrics SNIP and SJR Now Available in Scopus" (Press release). Elsevier. Retrieved 2014-07-27.
  11. S2CID 10644946.
  12. .
  13. .
  14. .
  15. .
  16. S2CID 3115544.
  17. .
  18. ^ West, Jevin Darwin. "Eigenfactor.org". Eigenfactor. Retrieved 2014-05-18.
  19. S2CID 113406795.
  20. .
  21. .
  22. One might expect, therefore, that a high JIF factor indicates a higher standard of interest, accuracy and reliability of papers published therein. This is sometimes true but unfortunately is certainly not always the case (Brembs 2018, 2019). Thus, Björn Brembs (2019) concluded: "There is a growing body of evidence against our subjective notion of more prestigious journals publishing 'better' science. In fact, the most prestigious journals may be publishing the least reliable science."
  23. .
  24. .
  25. ^ "Home". DORA.
  26. ^ Glick, William; Tsui, Anne; Davis, Gerald (2018-05-02). Cutler, Dave (ed.). "The Moral Dilemma to Business Research". BizEd Magazine. Archived from the original on 2018-05-07.
  27. S2CID 266921800.
  28. ^ "Australian Research Council ranking of journals worldwide". 2011-06-12. Archived from the original on 2011-06-12.
  29. S2CID 255013801.
  30. ^ "CORE Rankings Portal". core.edu.au. Retrieved 2022-12-27.
  31. ^ "Uddannelses- og Forskningsministeriet".
  32. ^ "Julkaisufoorumi". December 2023.
  33. ^ "Search in Norwegian List | Norwegian Register".
  34. ^ "Rating of Scientific Journals – ANVUR – Agenzia Nazionale di Valutazione del Sistema Universitario e della Ricerca".
  35. ^ "Chartered Association of Business Schools – Academic Journal Guide".
  36. ^ "List of HEC Recognized Journals".
  37. ^ "NAAS Score of Science Journals" (PDF). National Academy of Agricultural Sciences. 2022-01-01. Archived (PDF) from the original on 2023-03-15.
  38. ^ "Polish Ministry of Higher Education and Science (2019)". www.bip.nauka.gov.pl. Retrieved 2019-10-12.
  39. ^ "Polish Ministry of Higher Education and Science (2021)". www.bip.nauka.gov.pl. Retrieved 2021-02-09.
  40. S2CID 53387400.