INTERACTION BETWEEN THE UNIVERSITY AND BUSINESS
ISSN 1999-6640 (print) http://umj.ru
ISSN 1999-6659 (online)
DOI 10.15826/umpa.2019.05.043
MEASURING UNIVERSITY ENGAGEMENT
D. M. Kochetkov a,b, N. Kh. Sadekov b, I. A. Gudkova b
a National Research University Higher School of Economics, 20 Myasnitskaya str., Moscow, 101000, Russian Federation; [email protected]
b Peoples' Friendship University of Russia (RUDN University), 6 Miklukho-Maklaya str., Moscow, 117198, Russian Federation; [email protected]
Abstract. This article presents a model for the evaluation of scientific research output from the standpoint of university engagement with the socio-economic environment based on a scientometric analysis of topical areas. The primary aim was to examine various interrelations between conventional and alternative scientometric indicators that most clearly reflect the relationship between universities, industry and society. Three countries and five topical research areas were chosen as the object of the study. A comparative analysis showed that conventional scientometric indicators correlate quite well with the indicators of social and commercial relevance of scientific research. However, since this relationship was not observed in the case of Brazil, an assumption was made about the influence of the national and disciplinary context. The evaluation of university engagement cannot be performed based exclusively on quantitative indicators, thus requiring qualitative assessment, e. g. peer review.
Keywords: university engagement, engaged university, third mission, community engagement, scientometric indicators, peer review
Acknowledgements: The reported study was funded by RFBR, project number 18-00-01040 (18-00-01685) (recipients Dmitry Kochetkov & Nail Sadekov). The reported study was funded by RFBR, project number 18-00-01555 (18-00-01685) (recipient Irina Gudkova).
For citation: Kochetkov D. M., Sadekov N. Kh., Gudkova I. A. Measuring University Engagement. University Management: Practice and Analysis. 2019; 23(5): 75-84. DOI: 10.15826/umpa.2019.05.043
Introduction
Until recently, universities have enjoyed great academic freedom. The liberal governance model implied autonomy (delegation) based on trust [1]. Since the 19th century, governments and private sponsors have been allocating significant resources for the development of universities without requiring much accountability in response. At that time, there was no clear link between the progress of science and economic growth in public consciousness.
The Second World War convincingly demonstrated the ample possibilities of science. In addition, the post-war fertility boom stimulated expenditures on higher education [2]. The increased spending led to a demand for greater accountability, as the society
became interested in how its tax money was spent. People demanded that knowledge gained by pure science be practically useful. Industry, which directly or indirectly (through the tax system) funded science, also wanted to maximize the return on the money spent.
Towards the end of the 20th century, the concept of the knowledge economy became the mainstream development paradigm. Within the framework of neoliberalism, science is increasingly being considered as a production process with its input and output parameters. The university has become a principal actor in the socio-economic system. Undeniably, the ties between the university, government, and business had existed long before. The theory of innovation, the backbone of which was laid by Schumpeter [3], can be divided into the following areas:
- product design - diffusion of innovation [4];
- evolutionary - triple helix [5-9];
- organizational or strategic - open innovation [10-17], agile innovation [18];
- political - national and regional innovation systems [19-22].
The Triple Helix model proposed a new role for the university in the economy. The triple helix is applicable when overlapping of institutional spheres occurs. It is in the places of overlap that the phenomenon of the endless frontier of new knowledge generation arises, which is a prerequisite for the evolutionary development of systems [9].
The demand for greater science accountability raised the problem of new indicators for research productivity measurement. Until the 1990s, research performance had been primarily assessed using such qualitative instruments as peer review. However, the rapid development of information technologies, coupled with growing scholarly output, resulted in the dominance of scientometric (quantitative) indicators over qualitative ones.
Do the results of peer review and scientometric indicators coincide? The few studies conducted thus far have produced conflicting results. Thus, Mryglod et al. [23] found a strong correlation between quality and impact, although normalized per-head indicators showed a rather weak correlation. It was argued that scientometric indicators are not suitable for the assessment of research productivity in social sciences and humanities. At the same time, Harzing [24] found a strong link between the results of peer review carried out at British universities in the framework of REF (Research Excellence Framework) and the citation data retrieved from Microsoft Academic (MA). A recent study established that consistency between metrics and peer review is observed at the institutional level (rather than at the publication level), at least in the fields of physics, clinical medicine, public health, health services & primary care [25]. Nevertheless, it should be accepted that the entire evaluation procedure is becoming more impersonal.
At almost the same time, at the turn of the century, the first university rankings began to appear1. To a certain extent, they were designed to give a quantitative answer to the question of what should be done "in order to become Harvard". This presumption determined their bibliometric-based character; moreover, expert voting is also an impersonal procedure by nature. University rankings are a convenient quantitative
1 Strictly speaking, the U. S. News ranking began in 1983, but it was aimed primarily at an American audience. The major globally recognized rankings appeared in the 2000s, beginning with the Times Higher Education-QS World University Rankings in 2004.
tool, but their design presupposes their weaknesses. University rankings are rather a marketing tool for attracting resources (human and financial); their value for improving research performance remains unclear [26]. Most university rankings implicitly embed the organizational profile of an American university; it therefore comes as no surprise that most of the first places are occupied by American universities [27]. Rankings create "weak expertise," which is a compromise between the interests of key stakeholders and the robustness of methodology [28]. The ranking of the Three University Missions from Moscow State University2 stands apart. It is one of the first large-scale attempts to assess the engagement of universities in the solution of societal problems. In this context, U-Multirank3, which includes the indicators of regional engagement and knowledge transfer, should be mentioned.
Thus, the discussion on measuring university engagement in socio-economic processes continues. Bibliometric methods have limitations; at the same time, even ardent supporters of the peer review approach recognize the impossibility of using exclusively expert methods under the conditions of rapidly increasing information flows. In this study, we aim to show the applicability of alternative indicators for research performance evaluation. To this end, we set out to investigate those research areas in the technological frontier zone where maximum commercial and socially relevant results can be expected.
The rest of the article is organized as follows: the following section presents a scientometric analysis of the recent research in the field of university engagement; further, we describe the applied methodology; the Results section summarizes the analysis of traditional and alternative scientometric indicators, as well as the correlation analysis. In the Discussion and Conclusion section, we provide interpretation of the results, present the examples of university cases and also discuss the results of the Three University Missions ranking for 2019.
Recent Research
An analysis of recent literature was carried out using VOSviewer4. In addition to citation and co-authorship analysis, this software product possesses text-mining functionality [29, 30]. At the first stage, we performed a topical search in the Scopus5 and
2 Available at: https://mosiur.org/ (accessed: 05.11.2019).
3 Available at: https://www.umultirank.org/ (accessed: 05.11.2019).
4 Available at: http://www.vosviewer.com/ (accessed: 08.12.2018).
5 Available at: https://www.scopus.com (accessed: 08.12.2018).
Web of Science6 databases. Documents were retrieved for the five-year period 2014-2018. We identified terms that had occurred in combinations at least five times. Table 1 presents a comparative analysis of the results.
Table 1
Results of literature search*

Database                             | Scopus                   | Web of Science
Search query                         | university* W/1 engage*  | university* NEAR/1 engage*
Number of documents                  | 996                      | 618
2014-2018                            | 476                      | 360
Article, review, article in press**  | 348                      | 290
Number of terms                      | 66                       | 46

* Source: authors' own analysis based on Scopus and Web of Science data.
** This type of document is available only in Scopus.
Subsequently, we opted for the database with better coverage, i. e., Scopus. At the next stage, we merged single-root words and synonyms and also eliminated the words not carrying a thematic load (e. g., articles), denoting research methods (e. g., questionnaire, interview, etc.) or denoting a specific location (e. g., the United Kingdom, United States). As a result, we obtained a scientometric map of 54 terms (Fig. 1).
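The term-map construction described above (which VOSviewer performs internally) boils down to counting term occurrences and pairwise co-occurrences and then filtering by a minimum-occurrence threshold. A minimal sketch with a hypothetical toy corpus (the documents and the threshold of 2 are illustrative; the study itself used a threshold of 5):

```python
from collections import Counter
from itertools import combinations

# Hypothetical documents, each reduced to its set of extracted terms.
docs = [
    {"university engagement", "third mission", "innovation"},
    {"university engagement", "community engagement", "social justice"},
    {"third mission", "innovation", "knowledge transfer"},
    {"university engagement", "third mission"},
]

# Occurrence count of each term across the corpus.
term_freq = Counter(t for d in docs for t in d)

# Co-occurrence count of each (sorted) term pair within a document.
co_occurrence = Counter()
for d in docs:
    for a, b in combinations(sorted(d), 2):
        co_occurrence[(a, b)] += 1

# Keep only terms above the occurrence threshold, then keep the links
# (co-occurrence counts) between surviving terms.
MIN_OCC = 2
kept = {t for t, n in term_freq.items() if n >= MIN_OCC}
links = {pair: n for pair, n in co_occurrence.items()
         if pair[0] in kept and pair[1] in kept}
```

The link strength of a term pair is simply its co-occurrence count; the total link strength of a term is the sum over all its links.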
6 Available at: http://apps.webofknowledge.com (accessed: 08.12.2018).
The red cluster is the topic core. Note that most of "research" refers to university relations with society [31-37]; "innovation" [38] and "third mission" [39, 40] point to connections with industry. The blue cluster contains documents related to the educational foundations of university processes, such as "learning" and "curriculum" [41-44]. It also includes the organizational aspects of university processes: "organization and management" and "public relations" [45]. The green cluster represents the psychological foundations of higher education, with the centre of this class being formed by the identity of a student [46, 47]. A small yellow cluster combines "academic engagement" with "academic achievement" and "academic performance." Academic engagement, including academic entrepreneurship, is often considered at the individual level [44]. Interestingly, the connecting term between the red and blue clusters is "public health" [48, 49], which indicates the focus of modern economic, social, political and educational systems on maintaining human health and wellbeing. At the same time, "social justice" is the unifying term for all four clusters [50].
A complete list of terms is given in Appendix 1. Each link has its own strength, represented by a positive numerical value. The higher this value, the stronger the link. The total link strength attribute indicates the total strength of the co-occurrence links of a given term with other terms. The average normalized citation score is a relative indicator. The mean value normalizes the values; thus, the mean value always equals 1 [51].
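The mean-normalization described above can be made concrete in a few lines. The citation counts and the term's document indices below are hypothetical; the point is only that dividing by the map-wide mean forces the average normalized score to equal 1:

```python
import statistics

# Hypothetical citation counts of all documents in the map.
citations_all = [0, 2, 4, 10, 4]
mean_citations = statistics.mean(citations_all)  # map-wide mean

# Normalized citation score of each document: citations / map-wide mean.
normalized = [c / mean_citations for c in citations_all]
# By construction the mean normalized score over the map is exactly 1.
assert abs(statistics.mean(normalized) - 1.0) < 1e-12

# A term's average normalized citation score is the mean over the
# documents in which the term occurs (indices here are hypothetical).
term_docs = [1, 3]
term_score = statistics.mean(normalized[i] for i in term_docs)
```

A score above 1 thus means the term's documents are cited more than the map average, independently of field-level citation volumes.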
Fig. 1. Scientometric map of recent studies in university engagement. Source: authors' own analysis using VOSviewer
Methodology
The data was retrieved from the Scopus database for the period between 2014 and 2017 7. This period can be considered sufficient for the evaluation of research processes. Three countries were selected for analysis: the Netherlands, Brazil and Russia. The Netherlands represents a country with a developed economy. At the same time, the Netherlands features a developed university system, which not only produces high-quality research results, but also has successfully commercialized its research. Brazil is a country with an emerging economy and a reasonably stable higher education system with a large share of the private sector. Russia, on the contrary, is characterized by the lion's share of public universities and large-scale attempts to improve the global competitiveness of its higher education system. For the purposes of the analysis, five areas were chosen where commercially and socially relevant results can be expected:
- Biochemistry;
- Computer Science;
- Energy;
- Engineering;
- Medicine.
At the first stage, we analysed the values of conventional scientometric indicators for the indicated countries and research domains:
- Scholarly output is an indicator of the relative strength of a research area for a given object of analysis.
- Citation is an indicator of research impact. Citations were normalized per paper.
Further, we analysed two alternative indicators that show the link between scientific research and industry:
- Share of industry co-authored papers, i. e., papers with at least one author with a university affiliation and one author with an industry affiliation. This is an apparent link between university research and the economy. The advantage of this metric is real-time availability.
- Scholarly output cited by patents. This indicator is available with a time lag (2 years minimum).
Finally, we introduced the number of mentions in the media as an indicator of the social relevance of research. To this end, we had to go down one level of analysis, because mentions in the media usually refer to the university (author), rather than to the country or the research area as a whole.
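The industry co-authorship indicator can be sketched as follows. The paper records and their affiliation-type tags are hypothetical stand-ins for Scopus/SciVal affiliation data, not the study's actual records:

```python
# Hypothetical paper records: each carries the affiliation types of its
# authors ("academic" for universities, "corporate" for industry).
papers = [
    {"id": 1, "affil_types": ["academic", "academic"]},
    {"id": 2, "affil_types": ["academic", "corporate"]},
    {"id": 3, "affil_types": ["corporate"]},
    {"id": 4, "affil_types": ["academic", "corporate", "academic"]},
]

def is_industry_coauthored(paper):
    """At least one academic and one corporate affiliation on the paper."""
    types = set(paper["affil_types"])
    return "academic" in types and "corporate" in types

# Denominator: papers with any university affiliation at all.
n_univ = sum(1 for p in papers if "academic" in p["affil_types"])
# Numerator: papers co-authored by university and industry.
n_joint = sum(1 for p in papers if is_industry_coauthored(p))

share = n_joint / n_univ  # share of industry co-authored papers
```

The same counting scheme, applied per country and subject area, yields the values compared in the Results section.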
We identified 30 universities with the most significant number of publications for each country and
7 The dataset is available at: https://data.mendeley.com/datasets/ c3snzdszm4/1
research area. We used correlation analysis to search for possible relationships. In this case, we proceeded from the following hypotheses:
1. The number of publications in collaboration with industry positively correlates with the total scholarly output.
2. The number of mentions in the media is related to the total number of publications and/or citations.
3. The number of citations of scientific publications in patents positively correlates with the total number of citations of scientific publications of a university.
4. The number of publications co-authored by industry positively correlates with the number of citations of university publications in patents.
The citation indicator was taken as an absolute value, since the indicator of references in the media cannot be normalized to the article.
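Each of the four hypotheses reduces to a pairwise Pearson correlation over the 30 universities of one country/subject cell. A minimal sketch with illustrative numbers (not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-university values for one country/subject cell
# (in the study, each cell holds 30 universities).
publications = [120, 340, 90, 410, 260]   # total scholarly output
media_mentions = [14, 40, 8, 55, 30]      # mentions in mass media

r = pearson_r(publications, media_mentions)
```

Values of r near 1 for a cell correspond to the strong correlations reported for Russia in Tables 2-6, while values near 0 correspond to the absent relationships observed for Brazil.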
Results
The results of a comparative analysis of conventional scientometric indicators and indicators of the commercialization of research are presented in Figure 2.
Russia has an advantage in engineering and energy; these areas are based on the foundation laid down back in the Soviet times. At the same time, in medicine, the supreme position of the Netherlands is evident; Russia's lag in this area is particularly significant. The Netherlands is leading in terms of scientific impact in almost all analysed domains. A similar picture can be observed concerning the share of industry co-authored articles and the number of citations in patents. This similarity suggests the existence of a correlation between these indicators. For the correlation analysis, we selected 30 universities with the highest number of publications for each subject area and country. Tables 2-6 represent the results of the correlation analysis.
Table 2
Publications vs. Mass Media*

Subject area / Country | Biochemistry | Computer Science | Energy | Engineering | Medicine
Brazil                 | 0.14         | 0.21             | 0.23   | -0.04       | -0.10
Netherlands            | 0.86         | 0.70             | 0.04   | 0.62        | 0.92
Russia                 | 0.93         | 0.90             | 0.83   | 0.89        | 0.43
Total                  | 0.68         | 0.63             | 0.08   | 0.60        | 0.05

* Source: authors' own analysis. Data source: SciVal by Elsevier.
There is an average correlation between the number of publications and media references in the fields of biochemistry, computer science and engineering. At the same time, Russia demonstrates a pronounced correlation between these indicators in all areas except medicine. In Brazil, however, these figures are not correlated with each other.

Fig. 2. Comparative analysis of conventional and alternative scientometric indicators. Source: authors' own analysis. Data source: SciVal by Elsevier.
Table 3
Publications vs. Academic-Corporate Collaboration*

Subject area / Country | Biochemistry | Computer Science | Energy | Engineering | Medicine
Brazil                 | 0.95         | 0.84             | 0.44   | 0.31        | 0.17
Netherlands            | 0.83         | 0.46             | 0.63   | 0.50        | 0.99
Russia                 | 0.89         | 0.86             | 0.55   | 0.89        | 0.84
Total                  | 0.61         | 0.37             | 0.44   | 0.34        | 0.68

* Source: authors' own analysis. Data source: SciVal by Elsevier.
We found a moderate correlation between the number of publications in general and the number of publications in collaboration with industry. Again, in Russia, these indicators correlate in almost all areas.
The results of the correlation analysis of citations and media are very similar to those presented in Table 2. Therefore, a reasonable assumption can be made about the correlation between the number of publications and the number of citations.
Table 4
Citations vs. Mass Media*

Subject area / Country | Biochemistry | Computer Science | Energy | Engineering | Medicine
Brazil                 | 0.04         | 0.23             | 0.02   | -0.09       | 0.10
Netherlands            | 0.87         | 0.75             | 0.02   | 0.67        | 0.91
Russia                 | 0.92         | 0.84             | 0.82   | 0.85        | 0.48
Total                  | 0.83         | 0.72             | 0.08   | 0.65        | 0.12

* Source: authors' own analysis. Data source: SciVal by Elsevier.
Citations correlate with the patent-citation count in almost all areas for Russia and the Netherlands; however, this relationship is not observed for Brazil. Thus, conventional scientometric indicators and indicators of social engagement correlate almost everywhere for Russia and moderately for the Netherlands. In Brazil, this relationship is absent in most cases. In addition, we analysed the relationship between the number of publications in collaboration with industry and the number of citations of university publications in patents.
We observed a very high correlation coefficient in the field of medicine for all the countries under study. Thus, the participation of practitioners in the preparation of a medical article is an essential condition for its use in a patent application.
Table 5
Citations vs. Patent-Citations Count*

Subject area / Country | Biochemistry | Computer Science | Energy | Engineering | Medicine
Brazil                 | 0.77         | 0.31             | 0.19   | 0.46        | 0.34
Netherlands            | 0.84         | 0.67             | 0.70   | 0.85        | 0.89
Russia                 | 0.92         | 0.73             | 0.55   | 0.80        | 0.82
Total                  | 0.86         | 0.65             | 0.61   | 0.74        | 0.70

* Source: authors' own analysis. Data source: SciVal by Elsevier.
Table 6
Academic-Corporate Collaboration vs. Patent-Citations Count*

Subject area / Country | Biochemistry | Computer Science | Energy | Engineering | Medicine
Brazil                 | 0.69         | 0.37             | 0.42   | 0.14        | 0.99
Netherlands            | 0.72         | 0.51             | 0.55   | 0.67        | 0.88
Russia                 | 0.82         | 0.68             | 0.28   | 0.75        | 0.83
Total                  | 0.78         | 0.53             | 0.43   | 0.68        | 0.95

* Source: authors' own analysis. Data source: SciVal by Elsevier.
Discussion and Conclusion
The results of the correlation analysis partially support the hypothesis about the relationship between conventional scientometric indicators and indicators of social and commercial relevance of research. In Russia, these indicators correlate in almost all the analysed areas; in the Netherlands, we also observed a correlation, but not in all areas. In Brazil, the relationship between the indicators in most cases is absent. We also found a relatively strong correlation between the number of publications in collaboration with industry and the number of citations of scholarly output in patents. This relationship is most strongly expressed in the field of medicine.
On the basis of the obtained results, we argue that national and disciplinary contexts significantly influence the evaluation of university engagement. In each research domain, established traditions affect the number of publications, citations, industrial partnerships and knowledge transfer. At the same time, the activities of a university are influenced by the national economic, political and cultural context. Our results do not support the global university - local university dichotomy. We can only talk about the matrix of a university's strategic choice (Fig. 3). In this Figure, the horizontal focus is on research vs. education, while the vertical orientation is global vs. local markets.
Fig. 3. The matrix of a university's strategic choice. Source: authors' own analysis
It is essential that, under current conditions, a university cannot work exclusively at one of the poles horizontally; it can only make a strategic shift towards one direction or another. For example, it can be said that Harvard is somewhat more focused on education, while MIT is more focused on research and technology transfer. However, it is difficult to imagine that one of these institutions would completely abandon research or education, respectively. Universities opt either for the global or the local market. However, universities tend to be isomorphic: "they operate under similar incentive structures and imitate one another" [52].
The position of a locally engaged university also opens up plenty of strategic opportunities. Here is an example of the Zuyd University of Applied Sciences (the Netherlands)8, which is located on three campuses in Heerlen, Sittard and Maastricht. Zuyd is not included in the global university rankings. Its mission statement is short: "Professionals develop themselves with Zuyd." Zuyd University hosts 30 research centres. Associate professors, lecturers and students carry out practical and socially relevant research. They connect practice and education, contribute to innovations and R&D in the business sector. Research and knowledge transfer contribute to regional development and are designed in close cooperation with the regional or Euregional government bodies, the business world and educational institutions.
In the global or local market, the engagement mechanism works similarly. The thesis of the falsity of the opposition between global and local universities is also supported by the results of the Three University Missions ranking. In the Top 10, we again observe the dominance of American universities, with Harvard and MIT in the first two places (Table 7).
8 Available at: https://www.zuyd.nl/en (accessed: 14.12.2018).
It is interesting to note that the leading group is stable in composition (compared with the 2018 data); the only change is the emergence of Duke University in 10th place, replacing Columbia University.
Table 7
Top 10 of the Three University Missions ranking*

Rank | University                                   | Country
1    | Harvard University                           | United States
2    | Massachusetts Institute of Technology (MIT)  | United States
3    | University of Pennsylvania                   | United States
4    | Yale University                              | United States
5    | University of Cambridge                      | United Kingdom
6    | University of Oxford                         | United Kingdom
7    | Stanford University                          | United States
8    | University of California, Berkeley           | United States
9    | University of Chicago                        | United States
10   | Duke University                              | United States

* Source: https://mosiur.org/ranking/ (accessed: 06.11.2019).
We can assume that a modern university cannot function without a social mission and knowledge transfer. Nevertheless, we should note that this ranking still uses conventional scientometric indicators and a few altmetrics, such as views, the number of visitors of the university website and the number of subscribers to the university account in social media. Most local universities are out of sight due to low scientometric indicators (the ranking includes only 333 universities). In this case, we do need a peer review analysis.
It is not by chance that there are many examples of engaged universities in the Netherlands. The Dutch university evaluation system called the Standard Evaluation Protocol (SEP)9 is focused on assessing not only the quality of research but also its social significance. In particular, it contains Table D 1, where peers evaluate how effectively the university produces scientific knowledge for targeted social groups. The Dutch case is undoubtedly a positive experience, but it is not entirely clear how it can be scaled up. At the moment, we are not ready to offer a suitable organizational mechanism, but are open to discussion with interested readers.
Disclosure statement
The authors declare no conflict of interest.
9 Available at: https://www.knaw.nl/nl/actueel/publicaties/stand-ard-evaluation-protocol-2015-2021 (accessed: 14.12.2018).
References
1. Olssen M., & Peters M. A. Neoliberalism, higher education and the knowledge economy: from the free market to knowledge capitalism, Journal of Education Policy, 2005, vol. 20(3), pp. 313-345. (In Eng.).
2. Wissema J. G. Towards the Third Generation University: Managing the University in Transition. Cheltenham & Northampton: Edward Elgar, 2009. 252 p. (In Eng.).
3. Schumpeter J. A. The Theory of Economic Development. Cambridge: Harvard University Press, 1934. 255 p. (In Eng.).
4. Rogers E. M. Diffusion of Innovations (5th ed.). New York: Free Press, 2003. 576 p. (In Eng.).
5. Carayannis E. G., & Campbell D. F. J. "Mode 3" and "Quadruple Helix": toward a 21st century fractal innovation
ecosystem, International Journal of Technology Management, 2009, vol. 46(3/4), pp. 201-234. (In Eng.).
6. Carayannis E. G., & Campbell D. F. J. Triple Helix, Quadruple Helix and Quintuple Helix and how do knowledge, innovation and the environment relate to each other? A proposed framework for a trans-disciplinary analysis of sustainable development and social ecology, International Journal of Social Ecology and Sustainable Development, 2010, vol. 1(1), pp. 41-69. (In Eng.).
7. Etzkowitz H. Incubation of incubators: Innovation as a triple helix of university-industry-government networks, Science and Public Policy, 2002, vol. 29(2), pp. 115-128. (In Eng.).
8. Etzkowitz H., & Klofsten M. The innovating region: Toward a theory of knowledge-based regional development. R and D Management, 2005, vol. 35(3), pp. 243-255. https:// doi.org/10.1111/j.1467-9310.2005.00387.x (In Eng.).
9. Etzkowitz H., & Leydesdorff L. The dynamics of innovation: From National Systems and "mode 2" to a Triple Helix of university-industry-government relations, Research Policy, 2000, vol. 29(2), pp. 109-123. (In Eng.).
10. Belderbos R., Cassiman B., Faems D., Leten B., & Van Looy B. Co-ownership of intellectual property: Exploring the value-appropriation and value-creation implications of co-patenting with different partners, Research Policy, 2014, vol. 43(5), pp. 841-852. https://doi.org/10.1016/j. respol.2013.08.013 (In Eng.).
11. Bogers M., Chesbrough H., & Moedas C. Open innovation: Research, practices, and policies, California ManagementReview, 2018, vol. 60(2), pp. 5-16. https://doi.or g/10.1177/0008125617745086 (In Eng.).
12. Dahlander L., & Gann D. M. How open is innovation? Research Policy, 2010, vol. 39(6), pp. 699-709. https://doi. org/10.1016/j.respol.2010.01.013 (In Eng.).
13. Felin T., & Zenger T. R. Closed or open innovation? Problem solving and the governance choice, Research Policy, 2014, vol. 43(5), pp. 914-925. https://doi.org/10.1016/j. respol.2013.09.006 (In Eng.).
14. Laursen K., & Salter A. J. The paradox of openness: Appropriability, external search and collaboration, Research Policy, 2014, vol. 43(5), pp. 867-878. https://doi.org/10.1016/j.respol.2013.10.004 (In Eng.).
15. Lopez-Vega H., Tell F., & Vanhaverbeke, W. Where and how to search? Search paths in open innovation, Research
Policy, 2016, vol. 45(1), pp. 125-136. https://doi.org/10.1016/j. respol.2015.08.003 (In Eng.).
16. Saebi T., & Foss N. J. Business models for open innovation: Matching heterogeneous open innovation strategies with business model dimensions, European Management Journal, 2015, vol. 33(3), pp. 201-213. https://doi.org/10.1016/j. emj.2014.11.002 (In Eng.).
17. West J., Salter A., Vanhaverbeke, W., & Chesbrough, H. Open innovation: The next decade, Research Policy, 2014, vol. 43(5), pp. 805-811. https://doi.org/10.1016/j. respol.2014.03.001 (in Eng.).
18. Rigby D. K., Sutherland J., & Takeuchi H. The Secret History of Agile Innovation, Harvard Business Review, 2016. Retrieved from https://hbr.org/2016/04/the-secret-history-of-agile-innovation (In Eng.).
19. Freeman C. The 'National System of Innovation' in historical perspective, Cambridge Journal of Economics, 1995, pp. 5-24. https://doi.org/10.1093/oxfordjournals.cje. a035309 (In Eng.).
20. Grupp H., & Schubert T. Review and new evidence on composite innovation indicators for evaluating national performance, Research Policy, 2010, vol. 39(1), pp. 67-78. https:// doi.org/10.1016/j.respol.2009.10.002 (In Eng.).
21. Proksch D., Haberstroh M. M., & Pinkwart A. Increasing the national innovative capacity: Identifying the pathways to success using a comparative method, Technological Forecasting and Social Change,
2017, vol. 116, pp. 256-270. https://doi.org/10.1016/j. techfore.2016.10.009 (In Eng.).
22. Wu J., Ma Z., & Zhuo S. Enhancing national innovative capacity: The impact of high-tech international trade and inward foreign direct investment, International Business Review, 2017, vol. 26(3), pp. 502-514. https://doi.org/10.1016/j. ibusrev. 2016.11.001 (In Eng.).
23. Mryglod O., Kenna R., Holovatch Y., & Berche B. Comparison of a citation-based indicator and peer review for absolute and specific measures of research-group excellence, Scientometrics, 2013, vol. 97(3), pp. 767-777. https://doi. org/10.1007/s11192-013-1058-9 (In Eng.).
24. Harzing A.-W. Running the REF on a rainy Sunday afternoon: Do metrics match peer review? 2017. (In Eng.).
25. Traag V. A., Waltman L. Systematic analysis of agreement between metrics and peer review in the UK REF, Palgrave Commun, 2019, vol. 29, no. 5, pp. 1-12. https://doi.org/10.1057/s41599-019-0233-x (In Eng.).
26. Vernon M. M., Andrew Balas E., & Momani S. Are university rankings useful to improve research? A systematic review, PLoS ONE, 2018, vol. 13(3), pp. 1-15. https://doi. org/10.1371/journal.pone.0193762 (In Eng.).
27. Safon V. What do global university rankings really measure? The search for the X factor and the X entity, Scientometrics, 2013, vol. 97(2), pp. 223-244. https://doi. org/10.1007/s11192-013-0986-8 (In Eng.).
28. Lim M. A. The building of weak expertise: the work of global university rankers, Higher Education,
2018, vol. 75(3), pp. 415-430. https://doi.org/10.1007/ s10734-017-0147-8 (In Eng.).
29. Van Eck N. J., & Waltman L. Software survey: VOSviewer, a computer program for bibliometric mapping, Scientometrics, 2010, vol. 84, pp. 523-538. https://doi.org/10.1007/s11192-009-0146-3 (In Eng.).
30. Van Eck N. J., & Waltman L. Visualizing Bibliometric Networks, In Measuring Scholarly Impact, 2014, pp. 285-320. https://doi.org/10.1007/978-3-319-10377-8_13 (In Eng.).
31. Chile L. M., & Black X. M. University-community engagement: Case study of university social responsibility, Education, Citizenship and Social Justice, 2015, vol. 10(3), pp. 234-253. https://doi.org/10.1177/1746197915607278 (In Eng.).
32. de Rassenfosse G., & Williams R. Rules of engagement: measuring connectivity in national systems of higher education, Higher Education, 2015, vol. 70(6), pp. 941-956. https://doi.org/10.1007/s10734-015-9881-y (In Eng.).
33. Fleischman D., Raciti M., & Lawley M. Degrees of co-creation: an exploratory study of perceptions of international students' role in community engagement experiences, Journal of Marketing for Higher Education, 2015, vol. 25(1), pp. 85-103. https://doi.org/10.1080/08841241.2014.986254 (In Eng.).
34. Kindred J., & Petrescu C. Expectations Versus Reality in a University-Community Partnership: A Case Study, Voluntas, 2015, vol. 26(3), pp. 823-845. https://doi.org/10.1007/ s11266-014-9471-0 (In Eng.).
35. Levine P. A defense of higher education and its civic mission, Journal of General Education, 2014, vol. 63(1), pp. 47-56. https://doi.org/10.1353/jge.2014.0002 (In Eng.).
36. Mtawa N. N., Fongwa S. N., & Wangenge-Ouma, G. The scholarship of university-community engagement: Interrogating Boyer's model, International Journal of Educational Development, 2016, vol. 49, pp. 126-133. https:// doi.org/10.1016/j.ijedudev.2016.01.007 (In Eng.).
37. Whitley C. T., & Yoder S. D. Developing social responsibility and political engagement: Assessing the aggregate impacts of university civic engagement on associated attitudes and behaviors, Education, Citizenship and Social Justice, 2015, vol. 10(3), pp. 217-233. https://doi.org/10.1177/174619 7915583941 (In Eng.).
38. Trippl M., Sinozic T., & Lawton Smith H. The Role of Universities in Regional Development: Conceptual Models and Policy Institutions in the UK, Sweden and Austria, European Planning Studies, 2015, vol. 23(9), pp. 1722-1740. https://doi.org/10.1080/09654313.2015.1052782 (In Eng.).
39. Iorio R., Labory S., & Rentocchini F. The importance of pro-social behaviour for the breadth and depth of knowledge transfer activities: An analysis of Italian academic scientists, Research Policy, 2017, vol. 46(2), pp. 497-509. https:// doi.org/10.1016/j.respol.2016.12.003 (In Eng.).
40. Rosli A., & Rossi F. Third-mission policy goals and incentives from performance-based funding: Are they aligned? Research Evaluation, 2016, vol. 25(4), pp. 427-441. https://doi. org/10.1093/reseval/rvw012 (In Eng.).
41. Abuzar M. A., & Owen, J. A community engaged dental curriculum: A rural Indigenous outplacement programme, Journal of Public Health Research, 2016, vol. 5(1), pp. 27-31. https://doi.org/10.4081/jphr.2016.668 (In Eng.).
42. Crea T. M., & McFarland M. Higher education for refugees: Lessons from a 4-year pilot project, International Review of Education, 2015, vol. 61(2), pp. 235-245. https:// doi.org/10.1007/s11159-015-9484-y (In Eng.).
43. Dada O., Jack S., & George M. University-Business Engagement Franchising and Geographic Distance: A Case Study of a Business Leadership Programme, Regional Studies,
2016, vol. 50(7), pp. 1217-1231. https://doi.org/10.1080/00343 404.2014.995614 (In Eng.).
44. Frank A. I., & Sieh L. Multiversity of the twenty-first century - examining opportunities for integrating community engagement in planning curricula, Planning Practice and Research, 2016, vol. 31(5), pp. 513-532. https://doi.org/10.108 0/02697459.2016.1180573 (In Eng.).
45. Shiel C., Leal Filho W., do Paço A., & Brandli L. Evaluating the engagement of universities in capacity building for sustainable development in local communities, Evaluation and Program Planning, 2016, vol. 54, pp. 123-134. https://doi.org/10.1016Zj.evalprogplan.2015.07.006 (In Eng.).
46. Granado X. O., Mendoza Lira M., Apablaza C. G. C., & López V. M. M. Positive emotions, autonomy support and academic performance of university students: The mediating role of academic engagement and self-efficacy, Revista de Psicodidactica, 2017, vol. 22(1). https://doi.org/10.1387/ RevPsicodidact. 14280 (In Eng.).
47. Navarro-Abal Y., Gómez-Salgado J., López-López M. J., & Climent-Rodríguez J. A. Organisational justice, burnout, and engagement in university students: A comparison between stressful aspects of labour and university organization, International Journal of Environmental Research and Public Health, 2018, vol. 15(10). https://doi.org/10.3390/
ijerph15102116 (In Eng.).
48. Mazet J. A. K., Uhart M. M., & Keyyu J. D. Stakeholders in One Health, OIE Revue Scientifique et Technique, 2014, vol. 33(2), pp. 443-452. https://doi. org/10.20506/rst.33.2.2295 (In Eng.).
49. Spies L. A. Developing Global Nurse Influencers, Journal of Christian Nursing: A Quarterly Publication of Nurses Christian Fellowship, 2016, vol. 33(2), pp. E 20-E 22. (In Eng.).
50. Malfitano A. P. S., Lopes R. E., Magalhaes L., & Townsend E. A. Social occupational therapy: Conversations about a Brazilian experience, Canadian Journal of Occupational Therapy, 2014, vol. 81(5), pp. 298-307. https:// doi.org/10.1177/0008417414536712 (In Eng.).
51. Van Eck, N. J., & Waltman, L. VOSviewerManual (version 1.6.4), 2016, pp. 1-28. https://doi.org/10.3402/jac. v8.30072 (In Eng.).
52. Leydesdorff L., Bornmann L. & Mingers J., Statistical significance and effect sizes of differences among research universities at the level of nations and worldwide based on the leiden rankings, Journal of the Association for Information Science and Technology, 2019, vol. 70, pp. 509-525. https:// doi:10.1002/asi.24130 (In Eng.).
Submitted on 09.10.2019 Accepted on 12.11.2019
Information about the authors:
Dmitry M. Kochetkov - Head of the Center for Scientometrics and Development of the Scientific Journals, Peoples' Friendship University of Russia (RUDN University); [email protected].
Nail Kh. Sadekov - Leading Specialist of the Center for Scientometrics and Development of the Scientific Journals, Peoples' Friendship University of Russia (RUDN University); [email protected].
Irina A. Gudkova - PhD (Physics and Mathematics), Associate Professor, Applied Probability and Informatics Department, Peoples' Friendship University of Russia (RUDN University); 8-495-955-07-13; [email protected].
Appendix 1
Clusterization of the terms*
Label Cluster Links Total link strength Occurrences Avg. pub. year Avg. citations Avg. norm. citations
civic engagement red 2 2 6 2016.17 2.00 0.54
community engagement red 12 16 10 2016.90 2.30 0.79
developing countries red 9 11 5 2015.00 3.80 0.60
education red 36 86 23 2015.52 3.57 0.95
engaged university red 9 9 6 2016.83 1.33 0.48
entrepreneurial university red 8 8 5 2016.20 3.00 1.59
higher education red 24 46 36 2016.14 2.86 0.82
innovation red 10 14 6 2016.00 4.33 1.26
local participation red 8 13 5 2016.40 1.20 0.28
organization red 9 13 6 2016.00 1.83 1.08
public health red 16 21 5 2014.80 3.40 0.66
research red 12 17 8 2016.75 2.00 1.94
societies and institutions red 14 25 10 2015.60 6.90 2.42
student engagement red 1 9 8 2016.15 1.38 1.39
sustainable development red 15 29 12 2016.25 6.00 1.39
teaching red 19 36 15 2015.80 6.61 1.26
technology transfer red 1 10 8 2016.50 1.38 0.54
third mission red 1 9 9 2016.61 2.61 1.18
university engagement red 11 18 30 2016.43 2.63 0.96
university sector red 34 11 24 2016.50 2.42 0.88
university-community engagement red 6 6 6 2011.11 0.11 0.68
adolescent green 24 11 9 2014.61 5.18 1.18
adult green 32 112 15 2015.60 3.53 1.03
exercise green 16 40 5 2015.40 1.40 2.12
female green 21 101 13 2015.08 5.23 1.06
male green 29 120 15 2015.13 4.60 0.93
middle aged green 16 35 5 2015.20 9.20 3.13
motivation green 24 35 9 2016.89 1.18 0.98
physical activity green 14 30 5 2015.60 3.80 1.16
physical education green 13 16 5 2016.00 3.60 1.01
psychology green 25 58 10 2015.10 4.10 1.43
statistics and numerical data green 12 31 5 2014.60 5.40 1.04
student green 38 129 32 2015.94 3.84 1.09
university student green 11 29 6 2016.50 1.50 1.35
young adult green 21 14 10 2014.90 5.50 1.26
community-institutional relations blue 11 39 6 2015.83 3.50 1.45
curriculum blue 21 39 11 2015.45 3.36 0.14
health promotion blue 22 38 6 2016.11 8.00 3.38
human blue 36 244 51 2015.13 3.69 1.25
human experiment blue 14 31 1 2016.43 1.14 0.59
leadership blue 16 22 5 2016.00 3.20 1.00
learning blue 25 48 11 2016.45 0.55 0.25
organization and management blue 18 39 1 2014.86 5.43 1.29
procedures blue 28 80 12 2015.25 5.11 1.31
public relations blue 22 50 1 2015.11 3.00 1.25
scientist blue 14 23 5 2011.20 1.20 0.10
university blue 43 195 48 2015.61 3.88 1.11
academic achievement yellow 15 18 5 2011.00 0.60 0.18
academic engagement yellow 13 22 9 2016.33 3.00 1.23
academic performance yellow 14 21 5 2016.80 2.60 1.39
social justice yellow 19 21 6 2011.00 1.61 0.34
*Source: compiled by the authors using VOSviewer.
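Per-cluster summaries of the term map above can be reproduced programmatically from a VOSviewer term export. The sketch below is illustrative only (the `terms` list transcribes a few rows from the table; the function name and data layout are the authors' assumptions, not a VOSviewer API): it computes the occurrence-weighted average of normalized citations for each cluster.

```python
from collections import defaultdict

# A few rows transcribed from the table above:
# (label, cluster, occurrences, avg. normalized citations)
terms = [
    ("developing countries", "red", 5, 0.60),
    ("higher education", "red", 36, 0.82),
    ("student", "green", 32, 1.09),
    ("university", "blue", 48, 1.11),
    ("academic engagement", "yellow", 9, 1.23),
]

def cluster_summary(rows):
    """Occurrence-weighted average of normalized citations per cluster."""
    totals = defaultdict(lambda: [0, 0.0])  # cluster -> [occurrences, weighted sum]
    for _, cluster, occ, avg_norm in rows:
        totals[cluster][0] += occ
        totals[cluster][1] += occ * avg_norm
    return {c: round(s / n, 2) for c, (n, s) in totals.items()}

print(cluster_summary(terms))
```

Weighting by occurrences prevents rarely used terms from dominating the cluster average, which matters here because term frequencies in the map range from 5 to over 50.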