Tools for evaluating and ranking journals

Common methods for rating journals are based on statistical measures that examine how frequently each journal is cited in later publications. The assumption is that the more often an article is cited, the greater its impact on the research discourse.

 

Advantages:

The ranking allows researchers to choose high-impact journals in their field of research.
The ranking serves as an important tool for evaluating the impact and relevance of academic research.
Citation analysis helps identify leading scientists and trace the connections between their research groups.
Highlights the leading journals in a given research field.
A journal's success contributes to its dissemination and raises the prestige of its publisher.
Allows the end user to select the highest-quality articles.
The rating encourages competition between journals, which in turn drives authors to compete on the quality of their content.
Supports collection-development decisions in light of budget and space limitations.

 

Disadvantages:

Self-citation problem: self-citations by researchers artificially inflate the metrics.

Time factor. Rating methods based on citation counting give a distinct advantage to older journals: between two journals of equal quality, the older journal can be expected to receive a greater number of citations.

Quality problem. Many critical responses may be written following the publication of a poor article, so a high citation count does not necessarily indicate quality. Conversely, a mediocre journal that publishes a single high-quality article will gain citations and improve its ranking.

A rating method based on citation counting does not reflect changes over time in the quality of a given journal.

Counting-based methods give an advantage to journals that publish more articles. The higher the frequency of publication, the greater the number of citations in the journal.

For reliable and detailed information about journals published worldwide, see the ULRICH'S bibliographic database.

There are several impact indicators, from different databases:

Journal Impact Factor (JIF) and Quartiles (Q) - from the Journal Citation Reports (JCR) database, based on publications indexed in the Web of Science (WOS) database

 

CiteScore and SCImago Journal Rank (SJR) - based on publications in the SCOPUS database

 

Google Scholar Metrics - metrics based on the Google Scholar database.

 

JCR – Journal Citation Reports

The JCR database, produced by Clarivate (formerly Thomson Reuters), allows the evaluation of leading journals from over 3,300 publishers in various fields from 1997 onwards, whose articles are indexed in the Web of Science database. The database is updated once a year.
In this database you can identify leading journals using subject categories, author names, and research organization names; view a journal's current profile and history; and compare competing journals in the same category by IF.
The evaluation of journals in the JCR database is statistical and includes three tools: the Impact Factor, the Eigenfactor, and the percentage of a journal's articles that are cited.

 

Impact Factor (=IF):

The oldest and most widely accepted index for evaluating journals.

IF is a relative measure of citations to a journal's published articles; the higher the IF, the higher the standing of the journal. The impact factor for a given year is the number of citations received in that year by articles the journal published in the two previous years, divided by the total number of citable articles the journal published in those two years. For example, a journal's impact factor for 2020 is the number of 2020 citations to its 2018-2019 articles, divided by the number of articles it published in 2018-2019.
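As an illustration, here is a minimal sketch of the two-year calculation described above; the function name and the citation and publication counts are hypothetical, not taken from JCR.

```python
# A minimal sketch of the two-year Impact Factor calculation,
# using hypothetical citation and publication counts.

def impact_factor(citations_this_year_to_prev_two_years: int,
                  citable_items_prev_two_years: int) -> float:
    """Two-year Impact Factor: citations received this year by items
    published in the two previous years, divided by the number of
    citable items published in those two years."""
    return citations_this_year_to_prev_two_years / citable_items_prev_two_years

# Hypothetical example: a journal published 120 citable articles in
# 2018-2019, and those articles were cited 300 times during 2020.
print(impact_factor(300, 120))  # 2.5
```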

The index is field-dependent rather than absolute and varies greatly between different fields of science. Several studies suggest that journals that publish articles in Open Access receive citations at a higher rate.

Eigenfactor Score - this measure evaluates a journal's impact using citation data from the previous five years while examining the source of each citation; citations from leading journals receive greater weight.
Immediacy Index - the number of citations received in a given year by articles published in that same year, divided by the number of articles published that year (immediate impact). This figure can be found on the JCR platform.
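For concreteness, the sketch below computes an Immediacy-Index-style value; the function name and the counts are hypothetical.

```python
# A minimal sketch of the Immediacy Index, using hypothetical numbers.

def immediacy_index(citations_in_year_to_same_year_articles: int,
                    articles_published_in_year: int) -> float:
    """Citations received in a given year by articles published in that
    same year, divided by the number of articles published that year."""
    return citations_in_year_to_same_year_articles / articles_published_in_year

# Hypothetical example: 80 articles published in 2020 were cited
# 40 times during 2020 itself.
print(immediacy_index(40, 80))  # 0.5
```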

 

SCOPUS - Elsevier

The Scopus database, produced by Elsevier, indexes peer-reviewed research articles and reviews from over 5,000 publishers of research journals, in addition to books, conference proceedings, and patents in the fields of technology, exact sciences, medicine, social sciences, and humanities. The database is updated daily.
The ratings make it possible to evaluate authors, locate leading journals using subject categories, view a journal's profile and history, and compare competing journals across categories. This review focuses on the journal evaluation and author evaluation functions:

Additional indices based on the Scopus database:

CiteScore rank & trend - the relative ranking of the journal in the research field to which it belongs.

SJR (SCImago Journal Rank) - a rating that also takes into account the prestige of the citing journal (citations from journals considered to be of higher quality carry more weight). The calculation is based on citations over a three-year window. The SJR data is free and open to all.

SNIP (Source Normalized Impact per Paper) - a normalized impact index: the journal's average citations per paper measured relative to the typical citation rate in its field, so that journals in low-citation fields are not penalized (see the simplified sketch below).
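The sketch below illustrates only the general idea of this field normalization; it is a simplified stand-in, not Elsevier's actual SNIP algorithm, and all names and numbers in it are hypothetical.

```python
# A simplified, illustrative sketch of SNIP-style field normalization.
# This is NOT Elsevier's actual algorithm; the numbers are hypothetical.

def citations_per_paper(total_citations: int, total_papers: int) -> float:
    """Raw impact: average number of citations per published paper."""
    return total_citations / total_papers

def snip_like(journal_citations: int, journal_papers: int,
              field_citations: int, field_papers: int) -> float:
    """Divide the journal's citations per paper by the field's citations
    per paper, so journals in low-citation fields are not penalized."""
    journal_cpp = citations_per_paper(journal_citations, journal_papers)
    field_cpp = citations_per_paper(field_citations, field_papers)
    return journal_cpp / field_cpp

# Hypothetical example: the journal averages 4 citations per paper in a
# field that averages 2 citations per paper, giving a normalized value of 2.
print(snip_like(400, 100, 20_000, 10_000))  # 2.0
```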

 

Google - Google Scholar Metrics 

Google Scholar's top journals ranking lists the 100 most cited journals in each field of knowledge. The top journals can be viewed by field and subfield, including fields not well covered by the other databases, such as the humanities and arts. The ranking is based on the h5-index, which considers articles published in the last five years: h is the largest number such that h articles in the journal were each cited at least h times (for example, an h5-index of 50 means that 50 articles were each cited at least 50 times).
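To make the definition concrete, here is a minimal sketch that computes an h-index from a list of per-article citation counts; restricted to articles from the last five years, it yields an h5-style value. The citation counts are invented for illustration.

```python
# A minimal sketch of the h-index calculation: the largest h such that
# h articles have at least h citations each. Applied to a journal's
# articles from the last five years, this gives an h5-style value.
# The citation counts below are hypothetical.

def h_index(citation_counts: list[int]) -> int:
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four articles have at least 4 citations each
```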