Quality of Scholarly Journals and Major Selection Criteria for Coverage by the Web of Science
V.A. Markusova
All-Russian Institute for Scientific and Technical Information, Russian Academy of Sciences, Ul. Usievicha, 20, Moscow, 125190, Russia E-mail: markusova@viniti.ru Received 02.05.2012
ABSTRACT The major criteria for the selection of scholarly journals for coverage by the Web of Science are discussed. These criteria include the compliance of a journal with world standards, international diversity, the citation scores of authors and editorial board members, the journal impact factor, and journal self-citation. The need for unified transliteration of author names and for the use of a unified English-language name and address of a research organization, as well as for the inclusion of information on the funding organization (grant number and agency name), is emphasized.
KEYWORDS journals; impact factor; citation; self-citation; selection criteria; Journal Citation Report; Web of Science.
BIBLIOMETRIC INDICATORS AND THEIR ROLE IN THE EVALUATION OF THE RESEARCH ACTIVITY OF SCIENTIFIC INSTITUTIONS AND UNIVERSITIES
The significant geopolitical changes that have taken place during the past two decades, including the dissolution of the USSR, have considerably altered the landscape of world science. Starting in the 1990s, a wave of market liberalization resulted in the establishment of a new economy, which was accompanied by an unprecedented level of industrial activity, increased investment in the high-technology sector, and corresponding changes in the economic structure. The governments of a number of countries, including developing ones, consider science and technology to be the key factor of economic growth and development and have thus confronted the challenge of establishing a knowledge-intensive economy. The detailed pattern of the resulting significant changes in the number of students, PhD students, postdoctoral fellows, and researchers both in the U.S. and in other industrial and developing countries was presented in the report of the U.S. National Science Foundation titled Science & Engineering Indicators-2012 (SEI), published in January 2012 (www.nsf.gov). Impressive statistics on the investments made in basic research in the U.S., the European Union, China, Russia, and the other leading world countries were presented there. Furthermore, the measures undertaken by developing countries for the development of a science and technology (S&T) infrastructure, the stimulation of industrial research and development (R&D), the expansion of the higher education system, the establishment of national systems for basic research, and the establishment of their own markets for trade and foreign investment were discussed. As a result of the recent financial crisis, investments in innovation research and development are experiencing downward economic pressure; in this context, increasing attention is being paid to assessing the yield and efficiency of national innovation systems.
The fifth section of the Report was devoted to the bibliometric statistics characterizing the levels of research and development (R&D) in these countries. These statistics are based on the data contained in the Science Citation Index-Expanded (SCI-E) and the Social Science Citation Index-Expanded (SSCI-E), which are parts of the Web of Science (WoS) produced by Thomson Reuters (hereinafter, TR).
Publication in a prestigious international journal covered by the WoS is the most reliable way to bring research results to the attention of the world scientific community. Let us note that the experts who deal with assessing the output of scientific activity refer to all the journals covered by the WoS as “international.” A journal’s prestige, or its informational significance, is assessed through the journal impact factor (IF).
In accordance with the Resolution of the Presidium of the RAS № 212 dated October 12, 2009, “On the Establishment of the Committee for the Assessment of the Activity of Scientific Institutions,” the institutes of the Russian Academy of Sciences are in the process of collecting a vast body of statistical data, including bibliometric indicators such as the number of papers published by institute researchers in various databases (DB), citation scores, and the weighted average impact factor of an institution. The pitfalls of using these indicators were discussed in the journal Herald of the Russian Academy of Sciences [1]; it was noted that assessments of the output of scientific activity should be based on the decisions of experts in the corresponding field of science, whereas bibliometric indicators are an additional tool for decision making.
The global financial crisis of 2008 has forced most foreign organizations funding fundamental research to focus more on bibliometrics as an objective indicator of the quality of a scientific product. Thus, the UK Research Assessment Exercise has tied the distribution of financial support to universities to indicators of scientific output. A trend towards increased attention by university administrations to publications in prestigious international journals with high impact factors has also been observed. Unfortunately, this trend has had a negative result: namely, the disappearance of a number of national social science journals in the Netherlands [2].
The paper published by Dr. E. Garfield in Science [3] in 1955, in which he discussed the idea of citation indexing, was a historical moment in the use of the information flows of scientific literature consisting of papers and the citations within them. The experimental Science Citation Index (SCI) in natural science and engineering was published in 1963. The Institute for Scientific Information (ISI), which belonged to E. Garfield, began publishing the SCI on a regular basis in 1964. Since 2001, ISI has been owned by the Thomson Reuters company, the leader of the global market for information resources.
The scientific community rather rapidly welcomed the potential offered by the use of large collections of bibliographic information as a tool to assess the efficiency of the scientific activity of countries, universities, and research institutions. The bibliometric indicators of research output and its citedness in the U.S. and other countries were first published in 1972 in the report “Science Indicators” prepared by the U.S. National Science Foundation and have since been published every other year. Since 1996, these reports have been known as Science & Engineering Indicators (SEI).
The compilation principle of all the information products launched by ISI is based on the fact that, since the late 19th century, a paper published in a journal has been considered a form of scientific communication. The well-known American expression “publish or perish” attests to the fact that the number of papers published is a significant factor in the professional recognition of a researcher and facilitates his or her moving up the promotion ladder. The Nobel laureate and academician V.L. Ginzburg noted that “The timely publication of studies and the support of the best ones is a necessary condition for success in research, namely, in providing the international acknowledgement of this success” [4].
Today, as 100 years ago, a scientific paper contains citations of predecessor research. If a paper contains no citations, this is considered one of the signs of the low qualification of a beginning author and complicates the publication of the paper in a journal. By citing the works of other researchers, an author demonstrates the conceptual relationship between the subject of his or her paper and that of the works cited. Although not a strictly formalized language, citations make it possible to detect the internal conceptual links between publications [5].
The tens of thousands of articles, short notes, editorials, letters, and reviews published daily in journals, as well as the millions of references they cite, provide a window into the communication and dissemination of knowledge in science and supply empirical data on the significance of research and scientific activity [6].
Along with the accumulation of large collections of bibliographic information at ISI and the development of computing facilities in the U.S., there emerged an opportunity to create a new information product based on the relationships between journals. The Journal Citation Reports (JCR) for the natural sciences and engineering, which contained statistical data for 3,000 journals, was first published in 1975; since that time, it has been published annually. Its special edition, JCR-Social Sciences, has been published since 1978. According to E. Garfield, “Many researchers and editors often make the even more outrageous error of assuming that SCI was created just to produce its byproduct called Journal Citation Reports (JCR). The major purpose for these resources was not only to aid information retrieval but also to use it as an alerting tool, i.e. for selective dissemination of information” [7]. The term “impact factor” (IF) was first proposed by E. Garfield jointly with Dr. Irving Sher in 1955 [8]. The introduction of this term promoted a higher-quality selection of scholarly journals (hereinafter, journals) by libraries and information services. It soon gained popularity as a symbol of the scientific prestige of a journal, although its values vary significantly depending on the branch of knowledge and the relevant subject field.
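As a reminder of how this indicator is defined (the numerical example below is hypothetical and is not taken from the JCR), the impact factor for a given year is calculated over a two-year citation window: IF(2011) = A/B, where A is the number of citations received in 2011 by items the journal published in 2009-2010, and B is the number of citable items (articles and reviews) the journal published in 2009-2010. Thus, a journal that published 200 citable items in 2009-2010 and received 500 citations to them in 2011 would have IF(2011) = 500/200 = 2.5.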
The elaboration and development of methods for the analysis of bibliographic information has resulted in the emergence of a new scientific discipline: scientometrics. E. Garfield has noted that “We are witnessing the transformation of bibliometric research into a new field of industry - the assessment of the output of the research carried out by university and scientific teams” [7]. Although the discontent of the scientific community with the enthusiasm of bureaucrats from foundations and ministries for various ratings and evaluations is growing, the impact these indicators have on the financial support that global fundamental science can count on is becoming increasingly noticeable.
Over the past decade, progress in the development of databases (DB) and in the informatization of almost all social institutions and processes has resulted in the establishment of network technologies that enable one to work with very large collections of data. The information platform Web of Knowledge (WoK) is an example of a network technology of this type. One of its components, the information system Web of Science (WoS), comprises the expanded versions: the SCI-Expanded, the Social Science Citation Index-Expanded, and the Arts & Humanities Citation Index. The WoS is currently the largest citation DB, indexing over 800 million citations in papers published over the period from 1900 until 2010. A total of 12,600 journals are covered by the WoS. The attained level of development of network technologies allows one to use large data collections to resolve scientometric problems. In essence, we can speak about the establishment of “network” scientometrics. The network technology enables one to obtain a more adequate evaluation of the contribution of Russian science to the global thesaurus of knowledge.
In 2005, the well-known scientific publisher Elsevier (The Netherlands) established the SCOPUS database (hereinafter, SCOPUS) and posted it on the Internet; the system comprises articles and citations from 18,000 journals starting from 1996. A total of 230 Russian journals are covered by this system. The number of Russian journals covered by SCOPUS is higher than that covered by the Web of Science; however, this system has a number of significant drawbacks. In particular, it is extremely difficult to obtain reliable results in a general search using the names of Russian research institutions and the Russian Academy of Sciences. Still, Elsevier is very active in the post-Soviet space; it organizes numerous workshops for potential users. The search for organizations in this system is more complicated than in the SCI. It is also easier for a journal to be covered by SCOPUS than by the Web of Science, since the “selection criteria” are considerably less stringent in the former. Let us emphasize that all comparative evaluations of national contributions to global science and university rankings are based on bibliometric indicators from the Web of Science.
The monitoring of these bibliometric indicators is carried out in all industrially developed countries. The data relating to research output assist in making strategic decisions on the directions of research development and evaluating the position of a research institution or a university with respect to global standards in a certain field of knowledge.
JOURNAL SELECTION CRITERIA FOR COVERAGE IN THE WEB OF SCIENCE (WoS)
When submitting a journal for coverage in any foreign database, one needs to keep in mind the mission of the journal as the main channel of
scientific communication. The bibliographic data (including the correspondence address and e-mail of the author, working address of the organization, and source of financial support) and all citations within the journal are widely used to collect the bibliographic information and are regarded as important indicators of the researchers’ scientific activity.
A journal performs different functions; in general, they provide information
- on the development of research and its progress, the competitive strength of science, and the degree of its integration into the global scientific community;
- on the publication activity of the authors;
- on the publication activity and ranking of institutions;
- on the degree of recognition and the level of publications in the global scientific community, according to the citations of these papers; and
- on the quality of the national journals in comparison with the world flow of publications in the corresponding subject field, etc.
The research performed by the English bibliographer S. Bradford formed the theoretical foundation for the selection of journals for the SCI. In 1930, he formulated one of the key regularities in the distribution of publications among scientific periodicals, known as Bradford’s law of scattering. According to this law, if the journals of a certain subject field are ranked in decreasing order of the number of articles they publish on a given problem, they can be divided into three zones, each containing an equal number of articles devoted to that problem. These zones differ in the number and quality of the journals comprising them: the first zone (the Bradford core) includes specialized (“core”) journals that are directly devoted to the subject; the second zone includes journals that are partially devoted to it and to adjacent fields of knowledge; the third zone, the largest one, comprises journals whose subject fields are remote from the specified one. The number of journals in these three zones is roughly in the proportion 1 : n : n², where n depends on the subject field [9]. One of the major principles of the acquisition of library collections and of the activity of all information products and services is based on this law.
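By way of a hypothetical illustration of the 1 : n : n² proportion (the figures are invented for this example): if n = 5 and a subject field accumulates 3,000 articles per year, each zone would contain roughly 1,000 of those articles, with the first 1,000 concentrated in about 10 core journals, the second 1,000 scattered over about 50 journals, and the remaining 1,000 spread over about 250 journals.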
Thus, an analysis of the 7,621 journals covered in the Journal Citation Reports (JCR) in 2008 demonstrated that fewer than 300 journals accounted for approximately 50% of the citations and contained roughly 30% of the papers published.
A group of 3,000 “core” journals publishes approximately 80% of all papers; among those, 90% are cited at least once. This group of “core” journals changes in accordance with the evolution of science; therefore, the task of the company’s personnel is to update the list of journals covered, as well as to identify and evaluate new journals. The journals that have become less useful are eliminated from the database.
During the first years after the SCI was established, the editors of journals were wary of having their publications covered by it; over the years, however, the situation changed dramatically. The recognition of the significance of the SCI by the scientific community and its commercial success meant that E. Garfield’s office was flooded with editors’ requests: “Do not let us perish, include us in the SCI”
[10]. In the early 1970s, the caption “covered by SCI” first appeared on the covers of journals.
Three indicators, relating to both the qualitative and quantitative characteristics of journals, have been taken as the basis of the journal selection process:
- citation data;
- compliance of a journal with certain scientific and publishing standards (the journal standard); and
- expert evaluation.
For several decades, the TR company has drawn on the technological and scientific achievements of ISI, the unique information service that was the first to process the references contained in scientific articles, letters to the editor, editorial columns, and book reviews published in journals. These data are an invaluable source of information on the qualitative characteristics of journals that have been in publication over a long period of time [11].
The selection of new journals is based mostly on an expert assessment of a journal’s quality. The evaluation of journals and their exclusion from the system is a continuous process. Every two weeks, journals are added to or excluded from the TR databases. About 2,000 new journals are examined by the company staff annually; only 10-12% of them are selected for coverage.
Continual monitoring of journals ensures that they comply with high scientific standards and stay relevant to the products in which they are covered. All journals covered by the Web of Science undergo the same selection process regardless of the database in which they are to be covered: the Science Citation Index-Expanded, the Social Science Citation Index-Expanded, or the Arts & Humanities Citation Index. Special attention is given to the evaluation of journals in the social sciences and the arts & humanities due to the specific citation patterns in these fields of knowledge (lower citation indicators than in the natural sciences).
TR editors responsible for journal evaluation hold a degree in the particular field of science they work in. Since they monitor journals daily, they have become experts on the journals in their respective fields of science.
Scientific article titles
Scientific article titles need to be informative:
- only commonly accepted abbreviations can be used in article titles;
- no transliterations should be used in the titles of articles and abstracts translated into English; an exception is made for untranslatable names, such as proper names, names of instruments, etc.; no untranslatable slang can be used. This also applies to the abstracts (summaries) and keywords.
Geographic aspect
If a journal is not an outstanding exception and its subject field is of interest only to a small region, it is unlikely to find itself in TR products. Experience demonstrates that the best articles written by authors from Third World countries have been published in international journals; therefore, if a choice is to be made between two journals with the same specialization, the internationally oriented journal is preferred. Of course, this policy creates additional difficulties for authors from Third World countries with international ambitions. TR has always been blamed (and this has now been generally acknowledged) for the fact that the service is biased towards American and Western European journals. E. Garfield attributed this to the overabundance of research carried out in the U.S. and Western Europe, the results of which are published in English, German, and French. Special attention has always been paid to the analysis of Soviet journals. If an ISI analysis showed that a certain Soviet journal was not covered in the preparation of the SCI but was highly cited by the other source journals, this journal was added for processing. For example, this was the case with Teplofizika (Thermal Physics). At the time of writing, the Web of Science indexes a total of 166 Russian journals; 157 of them are covered in the Science Citation Index; 6 in the Social Science Citation Index (two of those are actually American journals focused on the politics and economics of the CIS, Russia, and China); and 3 Russian journals (Voprosy Istorii (Questions of History), Voprosy Filosofii (Questions of Philosophy), and Sotsiologicheskie Issledovaniia (Sociological Studies)) in the Arts & Humanities Citation Index.
Depth of coverage of the subject field
The depth of subject field coverage is one of the evaluation criteria when deciding whether a new journal should be covered by TR products. Based on prior experience with a particular professional scientific community or publishing company, a TR editor may decide to select a new journal launched by that community or company. Unfortunately, the high quality of previously published journals does not always guarantee that a new journal will be of high quality as well. It is natural to expect that a new journal published by a respectable publishing company with solid experience will be of high quality. However, publishing companies, pushed by a particularly interested group of specialists, rather frequently begin publishing a new journal prematurely. Moreover, the community of publishing companies is not monolithic with respect to publishing standards; there are significant discrepancies in the quality and periodicity of publications. The same observations apply to a number of journals that are subsidized by the government or partially subsidized by other organizations. The fate of such publications may hinge on annual budget fluctuations.
Compliance of the journal with major standards
The timeliness of publication is one of the criteria in the evaluation process. At the first stage of consideration of candidates for inclusion in TR products, the journal needs to adhere to its stated periodicity. The ability of a journal to come out strictly on schedule means that the editorial board has a large stock of accepted articles in its portfolio. It is unacceptable for a journal to violate its established publication schedule. In order to confirm the timeliness of publication, the editorial board needs to successively forward three current issues of the journal to the TR company as soon as they are published.
The TR company also takes notice of compliance with international publishing conventions, which optimize facilities for searching for source articles. These include an informative journal title, descriptive article titles and abstracts, a full bibliographic description of all the references contained in the articles, and complete address data for each author and organization.
In the modern world, English is the universal language of science. For this reason, journals published in English, or at least having their bibliographic descriptions and abstracts in English, are considered first by TR. The WoS includes a fair number of journals in which only the bibliographic data are published in English. However, it is obvious that the journals that will play the leading role in the international scientific community in the future will be published entirely in English. This is particularly true for the natural sciences.
The institution of peer review, which ensures the general quality of the materials submitted and the completeness of the cited references, is another important indicator of a journal’s standard.
Contents of the journal
As mentioned above, the core of scientific literature is the basis for all scientific disciplines. This core is not static; it changes along with the evolution of science. The emergence of new fields of knowledge results in the appearance of new journals once the volume of material published in the new field attains a certain critical value. The task of the editors is to determine whether a new journal will enrich the products in which it is to be included or whether the subject area is already represented sufficiently well without it.
The TR editors have access to a vast amount of cited literature and long experience in daily work with new journals. Therefore, they are capable of evaluating emerging scientific trends and active fields of research reflected in the journal literature, as well as of deciding whether it is necessary to cover a new journal in the WoS.
International diversity of the authors and the editorial board
TR editors pay attention to the international diversity of the authors, editors, and members of the editorial board. It is a significant factor for journals that aim to reach the international scientific community. The task of the company staff is to update the list of covered journals and to identify and evaluate new journals. Contemporary research is carried out all over the world; the international diversity of a journal is therefore likely to be important to the global scientific community.
However, there is a large number of excellent regional journals that are intended mostly for the local audience rather than the international community. They usually are not characterized by a great variety of materials from different countries, and TR does not impose this requirement on such journals.
All regional journals selected to be covered by TR products need to
provide complete bibliographic information in English (article title, abstract, keywords, authors’ addresses) and undergo the review process. All citations within the articles need to be presented in Latin letters. An appreciably large number of free software programs for creating bibliographic descriptions using the Roman alphabet can be found on the Internet. A Google search for “create citation” will offer several free software programs that allow the automatic creation of citations in accordance with specified standards (e.g., http://www.easybib.com/). It should be noted that descriptions can be created for different publication types (a book, a journal article, an Internet resource, etc.).
Citation analysis
Evaluating a journal is a unique process, since TR editors have an enormous collection of citation data at their disposal. Attention should be paid to the interpretation and comprehension of these data. Quantitative citation indicators should be used only in the context of journals belonging to the same subject field. For example, there are not so many articles in the field of crystallography, and they contain fewer citations compared to those in the field of biotechnology or genetics. Or let us take another field, the arts & humanities, which is covered in the Arts & Humanities Citation Index (A&HCI). Articles in these disciplines require a considerably longer period of time to accumulate a significant number of citations. By contrast, there is nothing unusual in an article from another field of science (e.g., the life sciences) reaching its citation peak 2-3 years after publication. These facts need to be taken into account for a fair evaluation.
The citation indicators are evaluated at least at two levels. First, the citation indicators of the journal
are determined on the basis of the impact factor or the total number of references to this journal. Then, the citation indicators of individual authors are evaluated. This type of analysis is always useful, in particular when a new journal has yet to build a citation history.
This type of evaluation is performed both for journals that have not been covered by TR products and for those that need re-evaluation. The latter include journals for which an increase in the number of citations has been observed for various reasons: the translation of articles into English, changes in editorial policy, changes of publishing company, etc. The free access of TR employees to all the literature cited (over 12,500 journals) means that the company has data on the number of citations both for the journals covered in its products and for those not covered.
Self-citation
Self-citation is the number of references to the articles published in the same journal. For instance, if journal X was cited by all journals 15,000 times, including 2,000 citations within journal X itself, the share of self-citation will be 2,000 × 100/15,000 ≈ 13.3%.
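The same calculation can be written as a short script (a minimal sketch with the hypothetical figures from the example above; the function name is invented for this illustration and is not part of any TR tool):

# Share of journal self-citation: the citations a journal receives from
# its own articles, as a percentage of all citations it receives.
def self_citation_share(total_citations, self_citations):
    if total_citations == 0:
        return 0.0
    return 100.0 * self_citations / total_citations

# Hypothetical journal X: cited 15,000 times in total,
# 2,000 of those citations coming from journal X itself.
print(self_citation_share(15000, 2000))  # prints 13.33...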
All journals tend to publish articles in a certain subject field. Since scientific progress builds on previous studies, there is nothing unusual about some degree of self-citation in a journal. However, deliberate self-citation is rather difficult to detect.
A high level of self-citation is not unusual for the leading journals in a certain field of knowledge, since high-quality papers, or papers devoted to a new, rapidly developing scientific discipline, are constantly being published in these journals. In the ideal situation, authors cite their previous studies because they are the most relevant to the work carried out, regardless of the journal in which they would like to be published. However, self-citation in these journals may become dominant in the total pool of citations. Self-citation can potentially distort the true value (position) of a journal in a particular subject field.
Among all the journals covered by the JCR Science Edition, 80% have a share of self-citation below 20%. A significant deviation of this indicator from the normal value forces TR to re-examine the journal in order to determine what effect the excess self-citation has had on the growth of the impact factor. If it is found that self-citation was used improperly, the impact factor of the journal will not be published, and it may be considered reasonable to exclude the journal from the Web of Science. There has been a myth in the scientific community that editorial boards can manipulate the impact factor. Let us provide a curious example. The journal The World Journal of Gastroenterology had an excellent start. In 2000, its first IF was 0.993. During subsequent years, it rose to 1.445, 2.532, and 3.318. However, the journal was excluded from the JCR in 2004. The steady increase in the IF was caused by self-citation rather than by the recognition of this journal by other authors. When calculating the IF, it was discovered that over 90% of the citations were self-citations. The journal was re-covered in the JCR in 2008; its IF was 2.081, and the share of self-citation was as low as 8%.
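The distorting effect is easy to see with hypothetical numbers (not related to the journal mentioned above): if a journal’s 300 citable items from the two preceding years receive 1,000 citations in the current year, its impact factor is 1,000/300 ≈ 3.3; but if 900 of those citations are self-citations, the impact factor calculated on external citations alone drops to 100/300 ≈ 0.33, an order of magnitude lower.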
In 2003, Marie McVeigh, Director of the JCR and Bibliographic Policy, examined the factors that cause high self-citation. It was found that self-citation is independent of the volume or subject field of the journal and is determined by the pattern of behavior of individual journals. The elimination of self-citation upon calculation of IF will have no effect on the ranking of most journals. When M. McVeigh detects journals whose IFs are mostly based
on self-citation, she acts as a parent would with naughty children: she punishes them and gives them some time to “reform.”
The journals are given a short period of time (usually several years), during which the editors are expected to become aware of the consequences of their policy. If they then follow common norms and standards, they are re-covered in the JCR [12].
Addresses of organizations and authors
It is necessary to follow the existing rules of transliteration and to use a unified spelling (in Latin letters) of authors’ last names and organization names. These data are of great significance for the collection of analytical statistics on the activity of organizations and individual researchers. In the WoS, there is a special “author finder” section, which contains data on the organizations where a researcher has worked and on his or her publications. The Russian Academy of Sciences loses a large number of publications because Russian researchers forget to mention that their organization is affiliated with the RAS.
If we would like to bolster the statistical presence of the RAS in global databases, the editors of journals published by the RAS need to be asked to use a single English version of the name of each RAS institute. Of course, the authors also need to use the same organization name; otherwise, it will be difficult to determine their real productivity.
Experience in working with the WoS has demonstrated that Russian scientists sometimes forget to specify the country (“Russia”) in the “organization address” field.
The TR company has developed a researcher identifier (ResearcherID), openly accessible to all Internet users: http://researcherid.com.
With a unique identifier assigned to each author in ResearcherID, you can eliminate author misidentification and view an author’s citation metrics. You need to visit the aforementioned website, enter your data, and write the commonly accepted English-language name of the institute where you work. The data pertaining to your identifier will subsequently be sent to you via e-mail. This will allow you to automatically add your publications to your profile and to save up to 10,000 bibliographic records obtained in searches of the WoS.
Sources of financial support
The data on the sources of financial support, including grant numbers, provided in the last section of an article are very important, since they enable searching by “funding agency” in the WoS. Therefore, journal editorial board members should pay attention to the presence of such data.
Social sciences and the Social Science Citation Index
All social science journals undergo the same evaluation process as natural science and engineering journals. The indicators considered are as follows: compliance of the journal with conventional standards, including the timeliness of publication; journal content; the international diversity of the authors; and the number of citations. The citation statistics take into account the fact that citation scores in the social sciences are generally considerably lower than those in the natural sciences and engineering. Special attention is paid to journals that focus on regional studies, since they play a special role in the social sciences. It is precisely the local specialization that is of interest for research.
Arts & Humanities and the Arts & Humanities Citation Index
Compliance with conventional standards (including timely publication) is important for the arts & humanities as well. However, the citation patterns in the arts & humanities do not always correspond to the predictable models observed in the social and natural sciences and engineering. Moreover, articles in the arts & humanities frequently cite books, musical compositions, and works of literature and art. The presence of an English text is not obligatory in certain fields of the arts and humanities, where it is not needed because of the national specificity of the object of study (e.g., studies devoted to regional literature).
Journal Website
A journal’s website is one of the indicators of its visibility on the Internet. It is reasonable to specify on the website which foreign and Russian databases process the journal. Prior to designing the website, we recommend having a look at foreign analogues in your subject area. These links should be provided on both the Russian- and English-language versions of the website. The guidelines for authors on manuscript submission should be updated in a timely manner and correspond to the facilities provided by new information technologies. The website should be updated constantly.
PROCEDURE FOR SUBMISSION OF A JOURNAL TO THOMSON REUTERS
When requesting the inclusion of a journal in TR products, one needs to double-check the following:
- Titles of the articles need to be informative - this requirement is considered by the expert system to be a major one;
- Only commonly accepted abbreviations can be used in article titles;
- No transliterations from the Russian language should be used in the titles of articles and abstracts translated into English; an exception is made for untranslatable names, such as proper names, names of instruments, etc.; no untranslatable slang known only to Russian-speaking specialists can be used;
- Each paper in the journal should be peer reviewed;
- Timeliness of the publications is mandatory;
- Each printed and/or electronic material should have an ISSN;
- A list of references using Latin letters should be provided;
- The journal title should have an English-language name;
- Publication year is mandatory;
- Volume and issue of the publication are required;
- An English title of the article should be provided;
- Either page numbers or an article number should be provided (the article number is not the DOI). If a journal uses numeration of both pages and articles, they should be listed separately, as indicated in parentheses (art. № 23, pp. 6-10, not 23.6-23.10);
- An English-language abstract should be provided for each scientific article;
- Authors’ last names and their addresses should be provided, including the e-mail of the author responsible for the distribution of reprints;
- All article identifiers, such as DOI, PII, and other article numbers are needed;
- Complete content list of each issue should include data pertaining to the pages/number of each article (except when only one article was published in the journal);
- Information on the source of financial support and grant number (if any) is needed;
- The assignment of these identifiers, both in the original articles and in the references, helps in matching citing and cited articles and in their proper identification by abstracting and indexing services.
To submit a journal to the Web of Science, one needs to
1. Establish an ongoing, complimentary subscription to the title for Thomson Reuters;
2. Send several most current issues of the journal;
3. Forward each subsequent issue of the journal to the following address: Thomson Reuters ATTN: PUBLICATION PROCESSING, 1500 Spring Garden Street, Fourth Floor, Philadelphia, PA 19130 USA.
REFERENCES
1. Ivanov V.V., Varshavskii A.E., Markusova V.A. //Herald of the RAS. 2011. № 7. P. 587-593.
2. Halfman W., Leydesdorff L. Is inequality among universities increasing? Gini coefficients and the elusive rise of elite universities. www.loet@leydesdorff.net
3. Garfield E. Citation Indexes for Science // Science. 1955. V. 122. № 3159. P. 108-111.
4. Ginzburg V.L. Sami vinovaty? Pochemu Rossiia poluchaet malo Nobelevskikh premii (Our Own Fault? Why Russia Receives Few Nobel Prizes) // Poisk. 2007. № 47. P. 4.
5. Price D.J. de S. Little Science, Big Science. New York: Columbia University Press. 1963.
6. Garfield E. How ISI Selects Journals for Coverage: Quantitative and Qualitative Considerations // Current Contents. 1990. № 22. P. 2-10.
7. Garfield E. A Century of Citation Indexing. Keynote address // 12th COLLNET Meeting, September 20-23, 2011. Istanbul: Istanbul Bilgi University, 2011.
8. Garfield E. The Agony and the Ecstasy - The History and Meaning of the Journal Impact Factor // J. Amer. Med. Association. 2006. V. 295. № 1. P. 90-93.
9. Mikhailov A.I., Chernyi A.I., Giliarevskii R.S. Nauchnye kommunikatsii i informatika (Scientific Communications and Informatics). Moscow: Nauka, 1976. 435 p.
10. Garfield E. Errors - theirs, ours, yours // Current Contents. 1974. V. 25. P. 5-6.
11. Testa J. The Thomson Reuters journal selection process http://thomsonreuters.com/products_services/science/ free/essays/journal_selection_process/
12. Gaming the Impact Factor Puts Journal in Time-out. www.scholarlykitchen.sspnet.org
An extended session of the Scientific Publishing Council of the Russian Academy of Sciences with representatives of Thomson Reuters, headed by Vice President J. Testa, was held on May 14, 2012. Mr. Testa delivered a report on the policy for selecting journals to be covered by the Web of Science and answered questions from the audience.
The meeting between J. Testa and the Vice Presidents of the Russian Academy of Sciences, A.I. Grigor'ev and S.M. Aldoshin, was held to discuss the results of the extended session of the Scientific Publishing Council. During the meeting, subsequent collaboration between the RAS and the Thomson Reuters company was discussed along the following directions:
- Possible expansion of the coverage of Russian journals in the Web of Science;
- The results of the free-of-charge test access of the RAS institutes to the Web of Science;
- The international conference on scientometrics issues that is planned to be held by the Institute for the Study of Science jointly with the Thomson Reuters company on October 10-12, 2013; and
- Establishment of a working group to solve problems of bibliometrics and collaboration between the Russian Academy of Sciences and the Thomson Reuters company.