Scientific article: 'Building of multi-factor model of world university ranking systems'

Keywords: multifactor model, world university ranking systems, factor analysis, retrospective analysis

Abstract (authors: Kavitska V., Liubchenko V.)

There are now many world ranking systems that are based on sets of indicators for university evaluation, so it is difficult to determine which indicators are essential in these ranking systems. The aim of the article is therefore to provide an integrated multifactor model of university ranking systems. The influential world and world university ranking systems are analyzed. Factor analysis of the influential world and world university ranking systems, based on a retrospective analysis over three years, is applied. The research is conducted in several groups and with normalized values of the ranking indicators to obtain a stable and objective result. An integrated multifactor model of world university ranking systems is proposed. The proposed model allows the majority of the world's universities to be evaluated.




UDC 004.832

DOI: 10.15587/2312-8372.2016.75712

BUILDING OF A MULTI-FACTOR MODEL OF WORLD UNIVERSITY RANKING SYSTEMS

The problem of multi-factor evaluation of universities by ranking systems is considered. The influential world and world university ranking systems are analyzed. Factor analysis of the influential world and world ranking systems is performed. The research was conducted in several groups, as well as with normalized indicator values, to ensure a stable and objective result. An integrated multi-factor model of world university ranking systems is proposed.

Keywords: multi-factor model, world university ranking systems, factor analysis, retrospective analysis.

Kavitska V. S., Liubchenko V. V.

1. Introduction

Ranking systems are widely used in various fields of economic, social and political activity in the world educational space. Ranking systems meet the demand of consumers of educational services and of the labor market for information about the reputation of a university, and contribute to increasing the participation of target groups in forming modern requirements to the quality of graduates.

The higher education system is constantly exposed to dynamic changes in the political and legal, social, economic, international, scientific and technological, environmental, socio-cultural and other spheres. In this situation, ensuring the quality of higher education depends on an adequate, pre-emptive response of higher education institutions to these changes.

Thus, the reasons that lead to continuous monitoring of higher education and to the use of rankings of higher educational institutions are defined. In particular, there is acute international competition among universities: students and teachers study and compare the quality of higher education outside their home country, and a unified international view on how the quality of a university should be defined is gradually being formed [1].

2. The object of research and its technological audit

The object of research is university ranking systems. The university, in its turn, is a complex object: it has a complex structure that consists of different departments, employees and various activities. Each component of the university has both quantitative and qualitative properties. The evaluation process involves the formation of quantitative characteristics of the object, taking into account all its properties and functions that are essential in a particular task; the result of this process is the evaluation in accordance with the ranking system.

Let the university be described by a set of properties S = {s_1, s_2, ..., s_m}. Within a ranking system, a set of requirements V = {v_1, v_2, ..., v_n} is presented to the university; these requirements are determined by the indicators of the ranking system. Then it is necessary to establish a direct correspondence between the requirements of the ranking system and the properties of the university, F(V) → S.

A large number of reasons for comparative analysis of universities generates different, sometimes opposite, aims for such studies. Applicants and their parents, employers and investors look at this problem from different perspectives. This, in turn, leads to the construction of evaluation models that differ significantly from one another in the following respects (a simple aggregation sketch is given after this list):

a) the objects of evaluation: universities, particular specialties or training programs;

b) a list of indicators that are taken into account;

c) the importance (weights) of indicators;

d) the method of forming the ranking calculation.
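To make this concrete, the following minimal Python sketch (with hypothetical indicator names, weights and values, not taken from any real ranking system) shows how a chosen list of indicators, their weights and an aggregation method combine into a single score for a university:

```python
# A minimal sketch of a weighted ranking calculation.
# Indicator names, weights and values are hypothetical and serve only
# to illustrate how evaluation models may differ in the list of
# indicators, their weights and the method of forming the score.

def rank_score(values: dict, weights: dict) -> float:
    """Aggregate indicator values (already on a common 0-100 scale)
    into a single weighted score."""
    return sum(weights[name] * values.get(name, 0.0) for name in weights)

weights = {"teaching": 0.3, "research": 0.4, "citations": 0.2, "international": 0.1}
university = {"teaching": 72.0, "research": 85.0, "citations": 64.0, "international": 90.0}

print(round(rank_score(university, weights), 1))  # 77.4
```

Changing the weights or the indicator list produces a different ranking, which is exactly why the results of different ranking systems are hard to compare directly.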

Each world ranking system is based on a set of indicators for university evaluation. It is difficult to determine which indicators are essential in the ranking systems. Therefore, in practice, quite a large number of indicators have to be taken into account, and some ranking agencies use hundreds of indicators. An analysis of the literature [2-4] leads to the conclusion that determining the latent factors of the world ranking systems is an important task.

3. The aim and objectives of research

The aim of the research is to provide an integrated multi-factor model of world university ranking systems.

To achieve this aim it is necessary to accomplish the following objectives:

1. Conduct a review of world university ranking systems.

2. Collect original data on the indicators of the considered world university ranking systems.

3. Conduct factor analysis and identify latent factors of the considered world university ranking systems.

4. Literature review

To build a multi-factor model of world university ranking systems, we need to consider the indicators of well-known ranking systems that are used for university evaluation. To ensure the objectivity and stability of the model, we consider the following groups of ranking systems.

Firstly, we choose the set of the most influential world university rankings (IWRS):

— Academic Ranking of World Universities (ARWU);

— QS World University Rankings (QS);

— Times Higher Education World University Rankings (THE).

Secondly, we add the set of less influential but popular world ranking systems (WRS):

— CWTS Leiden Ranking (CWTS);

— Ranking Web or Webometrics (Webometrics);

— SCImago Institutions Rankings (SIR).

Let us consider each ranking system that is used for university evaluation.

The ARWU was first published in June 2003 by the Center for World-Class Universities, Graduate School of Education of Shanghai Jiao Tong University, China, and updated on an annual basis. Since 2009 ARWU has been published and copyrighted by ShanghaiRanking Consultancy. More than 1200 universities are actually ranked by ARWU every year and the best 500 are published [5].

Universities are ranked by several criteria: quality of education (10 %), quality of faculty (40 %), research output (40 %), per capita performance (10 %). The highest scoring institution is assigned a score of 100, and the scores of the other institutions are calculated as a percentage of the top score. Thus, the resulting score is normalized to the range [0, 100].
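As a brief numerical illustration of this scaling (the raw values below are hypothetical, since raw indicator scores are not discussed here):

```python
# Hypothetical raw indicator values for three universities.
# The top-scoring institution receives 100; the others receive a
# percentage of the top score, so all results fall in [0, 100].
raw = {"University A": 250.0, "University B": 125.0, "University C": 50.0}
top = max(raw.values())
scaled = {name: value / top * 100 for name, value in raw.items()}
print(scaled)  # {'University A': 100.0, 'University B': 50.0, 'University C': 20.0}
```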

ARWU uses indicators such as the total number of alumni of an institution winning Nobel Prizes and Fields Medals, and the total number of staff of an institution winning Nobel Prizes in Physics, Chemistry, Medicine and Economics and Fields Medals in Mathematics. This allows outstanding universities to be identified, but gives no possibility to evaluate regular universities, which constitute the majority of the world's universities. Also, ARWU uses SCI (Science Citation Index)/SSCI (Social Science Citation Index) papers and papers published in Nature and Science as indicators of research output. The SCI/SSCI indicator reflects only the quantity of papers and does not consider their quality (citations). The Nature/Science indicator captures extremely outstanding research in certain subject disciplines only. So ARWU is focused on highly outstanding research, and its indicators cannot capture a wide range of scientific research, which does not allow the majority of the world's universities to be evaluated.

QS helps students compare world universities. QS assesses universities in four areas: research, teaching, employability and internationalization. Each of the indicators has a different weight in the calculation of the overall score. The main problem of QS is that peer review accounts for 33 % of the criteria. Such a high percentage of peer review can affect the results of the evaluation of world universities [6].

THE is a set of performance tables that judge research-intensive universities across their missions: teaching, research, knowledge transfer and international outlook. The performance indicators of THE are grouped into five areas: teaching (the learning environment), research (volume, income and reputation), citations (research influence), international outlook (staff, students and research), and industry income (knowledge transfer). In THE, peer review also accounts for 33 % of the indicators. In addition, THE does not take into account universities that have fewer than 200 scientific publications per year [7].

The CWTS ranking is based on publications in Thomson Reuters' Web of Science database. Within Web of Science, only so-called core publications, which are publications in international scientific journals, are included. In addition, only articles and reviews published within Web of Science are considered. So CWTS is also focused on highly outstanding research [8].

Webometrics is an academic ranking of higher education institutions. Since 2004, every six months an independent scientific exercise has been performed by the Cybermetrics Lab to provide multidimensional information about the performance of world universities based on their web presence and impact [9]. The indicators of Webometrics are:

— Presence — presence or size. The initial data are the number of pages (and sites on subdomains) in the Google index;

— Impact — influence or visibility. The initial data come from external links to the site of the university according to MajesticSEO.com and Ahrefs.com;

— Openness — openness, document files. The initial data are the number of documents at the site of the university known to Google Scholar;

— Excellence — excellence, scientific publications. The initial data are the number of scientific publications among the 10 % most cited (by science domains) according to SCImago.

However, the content of Internet sites does not reflect the quality of education at the university.

SIR is a science evaluation resource to assess worldwide universities and research-focused institutions. Its indicators are divided into three groups intended to reflect scientific, economic and social characteristics of institutions. The SIR includes both size-dependent and size-independent indicators, that is, indicators influenced and not influenced by the size of the institutions. In this manner, the SIR provides overall statistics of the scientific publications and other output of institutions, and at the same time enables comparison between institutions of different sizes. However, SIR is entirely based on bibliometric indicators of the quality of research [10].

5. Materials and methods of research

We analyzed the data according to the following procedure:

1. The original data on the university ranking systems were collected. Information was collected on the indicators with available values for each ranking system among the Top-50 universities.

2. The research was conducted in three groups of ranking systems, namely «IWRS», «WRS» and «All ranking systems (ARS)». Note that research in these groups, together with the retrospective analysis, determines the objectivity and stability of the research results.

3. The correlation between the indicators was calculated in each group.

4. Factor analysis was performed and latent factors were identified in each group. Factor analysis was performed by principal component analysis using the statistical package IBM SPSS 20.

5. The research was conducted in the groups «WRS (normalized values)» and «ARS (normalized values)», in which all indicators were normalized to the range [0, 100]:

k̃_i = (k_i − k_i,min) / (k_i,max − k_i,min) × 100,   (1)

where k_i is the current value of the ranking indicator, k_i,min is the minimum value of the ranking indicator in the group, and k_i,max is the maximum value of the ranking indicator in the group. A small computational sketch of this normalization and of the factor-analysis step is given below.
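The authors performed these steps in IBM SPSS 20; purely as an illustrative sketch, the same normalization by formula (1), the correlation step and a PCA-based factor extraction can be reproduced with open-source tools on a small hypothetical data set (the indicator values below are made up):

```python
# Sketch of steps 3-5 of the procedure: correlation between indicators,
# PCA-based factor extraction and normalization by formula (1).
# The original analysis was done in IBM SPSS 20; this is an open-source
# equivalent on hypothetical data, not the authors' code.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical values of four indicators for 50 universities.
data = pd.DataFrame(
    rng.uniform(0, 100, size=(50, 4)),
    columns=["Presence", "Impact", "Openness", "Excellence"],
)

# Formula (1): min-max normalization of every indicator to [0, 100].
normalized = (data - data.min()) / (data.max() - data.min()) * 100

# Step 3: correlation matrix between the indicators of the group.
correlations = normalized.corr()

# Steps 4-5: standardize and run principal component analysis;
# by the Kaiser rule, components with eigenvalues above 1 are
# usually retained as latent factors.
standardized = (normalized - normalized.mean()) / normalized.std()
eigenvalues = PCA().fit(standardized).explained_variance_
retained = [i for i, ev in enumerate(eigenvalues, start=1) if ev > 1]

print(correlations.round(2))
print("Components retained as latent factors:", retained)
```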

Let us present the indicators that were considered in the research for 2013, 2014 and 2015 in the following groups.

The following indicators were collected for the Top-50 universities in the group «IWRS» in 2013, 2014 and 2015:

— Alumni — alumni of an institution winning Nobel Prizes and Fields Medals;

— Award — staff of an institution winning Nobel Prizes and Fields Medals;

— HiCi — highly cited researchers in 21 broad subject categories;

— N&S — papers published in Nature and Science;

— PUB — papers indexed in Science Citation Index-expanded and Social Science Citation Index;

— PCP — per capita academic performance of an institution;

— Academic Reputation — the best institutions within a field of expertise, as voted by academics;

— Employer Reputation — the survey of employers to identify the universities they perceive to be producing the best graduates;

— Faculty Student Ratio — measure of the number of academic staff employed relative to the number of students enrolled;

— International faculty ratio — proportion of international faculty members at the institution;

— International student ratio — proportion of international students at the institution;

— Citations Per Faculty — the total citation count according to Scopus is assessed in relation to the number of academic faculty members at the university;


— Teaching — the learning environment;

— International Outlook — staff, students, and research;

— Research — volume, income, and reputation;

— Citations research influence — the number of times a university's published work is cited by scholars globally, compared with the number of citations a publication of similar type and subject is expected to have;

— Industry income — knowledge transfer;

— Number of FTE Students;

— Students:Staff Ratio;

— International Students.

Note that the indicators in the group «IWRS» are stable — they have not changed over the three years. Also, in the group «IWRS» all indicators are evaluated on comparable scales in the range [0, 100], except for one indicator, «Number of FTE Students».

The indicators that were collected for the Top-50 universities in the group «WRS» in 2013, 2014 and 2015 are shown in Table 1.

Table 1

Indicators in group «WRS»

Group «World ranking systems»:

— 2013: CWTS Leiden Ranking¹: P (impact); PP (top 10 %); MCS; MNCS; P (collab); PP (collab); PP (int collab); PP (UI collab); MGCD. Webometrics: Presence; Impact; Openness; Excellence. SCImago Institutions: Innovative Knowledge.

— 2014: CWTS Leiden Ranking²: P (impact); PP (top 10 %); MCS; MNCS; P (collab); PP (collab); PP (int collab); PP (UI collab); PP (<100 km); PP (>1000 km). Webometrics: Presence; Impact; Openness; Excellence. SCImago Institutions: Innovative Knowledge.

— 2015: CWTS Leiden Ranking: P (impact); PP (top 1 %); PP (top 10 %); PP (top 50 %); P (collab); PP (collab); PP (int collab); PP (industry); PP (<100 km); PP (>5000 km). Webometrics: Presence; Impact; Openness; Excellence. SCImago Institutions³: Website Size.

Notes: 1. Indicators considered in the research for the CWTS Leiden Ranking in 2013:

P (impact) — number of publications 2008-2011. Collaborative publications are counted fractionally.

PP (top 10 %) — proportion of top 10 % publications: the proportion of the publications of a university belonging to the top 10 % of their field.

MCS — mean citations score: average number of citations of the publication of a university.

MNCS — mean normalized citations score: average number of citations of the publication of a university normalized for field differences and publication year.

P (collab) — number of publications 2008-2011.

PP (collab) — proportion of interinstitutional collaborative publications: the proportion of the publications of a university co-authored with one or more other organizations.

PP (int collab) — proportion of international collaborative publications: the proportion of the publications of a university co-authored by two or more countries.

PP (UI collab) — proportion of collaborative publications with industry: the proportion of the publications of a university co-authored with one or more industrial partners.

MGCD — mean geographical collaboration distance (km): average geographical collaboration distance of the publication of a university.

2. Indicators considered in the research for the CWTS Leiden Ranking in 2014:

P (impact) — number of publications 2009-2012. Collaborative publications are counted fractionally.

PP (top 10 %) — proportion of top 10 % publications: the proportion of the publications of a university belonging to the top 10 % of their field.

MCS — mean citations score: average number of citations of the publication of a university.

MNCS — mean normalized citations score: average number of citations of the publication of a university normalized for field differences and publication year.

P (collab) — number of publications 2009-2012.

PP (collab) — proportion of interinstitutional collaborative publications: the proportion of the publications of a university co-authored with one or more other organizations.

PP (int collab) — proportion of international collaborative publications: the proportion of the publications of a university co-authored by two or more countries.

PP (UI collab) — proportion of collaborative publications with industry: the proportion of the publications of a university co-authored with one or more industrial partners.

PP (<100 km) — proportion of short distance collaborative publications: the proportion of the publications of a university with the geographical collaboration distance of less than 100 km.

PP (>5000 km) — proportion of long distance collaborative publications: the proportion of the publications of a university with the geographical collaboration distance of more than 5000 km.

3. No Innovative Knowledge results were available for the period of the research (January 2016).

In the group «All ranking systems», all the indicators of the Top-50 universities in 2013, 2014 and 2015 were combined into one group to provide an objective and stable result.

The indicators that were normalized to the range [0, 100] in the groups «WRS» and «ARS» are shown in Table 2.

Table 2

Indicators that were normalized to the range [0, 100]

Group «World ranking systems»:

— 2013: P (impact); MCS; MNCS; P (collab); MGCD; Presence; Impact; Openness; Excellence; Innovative Knowledge.

— 2014: P (impact); MCS; MNCS; P (collab); Presence; Impact; Openness; Excellence; Innovative Knowledge.

— 2015: P (impact); P (collab); Presence; Impact; Openness; Excellence; Website Size.

Group «All ranking systems»:

— 2013: Number of FTE Students; P (impact); MCS; MNCS; P (collab); MGCD; Presence; Impact; Openness; Excellence; Innovative Knowledge.

— 2014: Number of FTE Students; P (impact); MCS; MNCS; P (collab); Presence; Impact; Openness; Excellence; Innovative Knowledge.

— 2015: Number of FTE Students; P (impact); P (collab); Presence; Impact; Openness; Excellence; Website Size.

6. Results of research

To prove the objectivity and stability of the multi-factor model, we have to consider the results in each group. The results of the factor analysis in the group «IWRS» for 2013, 2014 and 2015 are shown in Table 3.

Table 3

Result of the factor analysis in group «IWRS»

Group «Influential world ranking systems»:

— 2013: 1) Alumni; 2) Award; 3) HiCi; 4) N&S; 5) PUB; 6) PCP; 7) Employer Reputation; 8) Number of FTE Students; 9) Teaching; 10) Research; 11) Citations.

— 2014: 1) Alumni; 2) HiCi; 3) N&S; 4) PUB; 5) PCP; 6) Number of FTE Students; 7) Teaching; 8) Research; 9) Citations.

— 2015: 1) Alumni; 2) HiCi; 3) N&S; 4) PUB; 5) PCP; 6) International Outlook; 7) Number of FTE Students; 8) Industry income; 9) Students:Staff Ratio; 10) Teaching; 11) Research; 12) Citations.

Note that in the group «IWRS» the indicators were stable and presented on the same scale of measurement, except for one indicator, «Number of FTE Students». The result of the factor analysis in this group is stable — the intersection of the sets of latent factors over the three years contains 9 factors.

The results of the factor analysis in the groups «WRS» and «WRS (normalized values)» for 2013, 2014 and 2015 are shown in Table 4.

Table 4

Result of the factor analysis in groups «WRS» and «WRS (normalized values)»

Group «World ranking systems»:

— 2013: 1) PP (collab); 2) PP (UI collab); 3) PP (int collab); 4) PP (top 10 %); 5) MGCD; 6) MCS; 7) MNCS; 8) Presence; 9) Impact; 10) Openness; 11) Excellence.

— 2014: 1) PP (collab); 2) P (collab); 3) PP (UI collab); 4) PP (>1000 km); 5) P (impact); 6) PP (top 10 %); 7) MCS; 8) MNCS; 9) Openness; 10) Impact; 11) Excellence.

— 2015: 1) PP (collab); 2) PP (int collab); 3) PP (top 1 %); 4) PP (top 10 %); 5) PP (top 50 %); 6) Excellence; 7) PP (industry); 8) PP (>5000 km); 9) Presence; 10) Impact; 11) Openness.

Group «World ranking systems (normalized values)»:

— 2013: 1) MNCS; 2) MGCD; 3) P (collab); 4) MCS; 5) Presence; 6) Impact; 7) Openness; 8) Excellence.

— 2014: 1) Innovative Knowledge; 2) P (impact); 3) P (collab); 4) Presence; 5) Impact; 6) Openness; 7) Excellence.

— 2015: 1) PP (top 1 %); 2) PP (top 10 %); 3) Website Size; 4) PP (top 50 %); 5) PP (industry); 6) Presence; 7) Impact; 8) Openness; 9) Excellence.

Note that the indicators in the group «WRS» are represented in different scales of measurement, namely: in 2013 — 79 % of the indicators, in 2014 — 60 %, in 2015 — 47 %. In these groups the indicators were not stable, because the indicators of the CWTS Leiden Ranking changed over the three years (they constitute on average 50 % of the indicators in the group «WRS»).

The intersection of the sets of latent factors in the group «WRS» over the three years contains 5 factors, and in the group «WRS (normalized values)» — 4 factors. This result can be regarded as objective, but the benefits of using normalized values of the indicators are not yet visible.

The results of the factor analysis in the groups «ARS» and «ARS (normalized values)» for 2013, 2014 and 2015 are shown in Table 5.

The indicators in the group «ARS» are presented in different measurement scales, namely: in 2013 — 35 % of the indicators, in 2014 — 31 %, in 2015 — 26 %. The intersection of the sets of latent factors in the group «ARS» over the three years contains 10 factors, and in the group «ARS (normalized values)» — 15 factors. This result can be regarded as objective and shows the benefits of using normalized values of the indicators.

Analyzing the results of the research for 2013, 2014 and 2015, we can identify the latent factors shown in Table 6. These results were obtained by determining the intersection of the sets of resulting latent factors in each group over the three years; a small computational sketch of this step is given below.
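For the «IWRS» group, for example, this intersection step reduces to a simple set operation over the per-year factor lists from Table 3; the following short Python sketch reproduces it:

```python
# Intersection of the latent-factor sets of the «IWRS» group over
# three years; the per-year factor lists are taken from Table 3.
iwrs_2013 = {"Alumni", "Award", "HiCi", "N&S", "PUB", "PCP",
             "Employer Reputation", "Number of FTE Students",
             "Teaching", "Research", "Citations"}
iwrs_2014 = {"Alumni", "HiCi", "N&S", "PUB", "PCP",
             "Number of FTE Students", "Teaching", "Research", "Citations"}
iwrs_2015 = {"Alumni", "HiCi", "N&S", "PUB", "PCP", "International Outlook",
             "Number of FTE Students", "Industry income",
             "Students:Staff Ratio", "Teaching", "Research", "Citations"}

stable = iwrs_2013 & iwrs_2014 & iwrs_2015
print(len(stable), sorted(stable))  # 9 latent factors stay stable over the three years
```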

The result provides an integrated multi-factor model of world university ranking systems and can be considered for each group: influential world, world and all ranking systems. Also, calculations with normalized values of the indicators are more stable — the intersection of the sets «IWRS» and «ARS» contains 2 factors, while the intersection of the sets «IWRS» and «ARS (normalized values)» contains 8 factors.

Table 5

Result of the factor analysis in groups «ARS» and «ARS (normalized values)»

Group «All ranking systems»:

— 2013: 1) Award; 2) HiCi; 3) N&S; 4) PUB; 5) PCP; 6) PP (collab); 7) P (collab); 8) PP (int collab); 9) PP (UI collab); 10) PP (top 10 %); 11) MGCD; 12) MNCS; 13) MCS; 14) Research; 15) Presence; 16) Impact; 17) Openness; 18) Excellence.

— 2014: 1) PUB; 2) P (collab); 3) PP (collab); 4) PP (int collab); 5) PP (UI collab); 6) P (impact); 7) PP (top 10 %); 8) PP (>1000 km); 9) PP (<100 km); 10) MNCS; 11) MCS; 12) Innovative Knowledge; 13) Teaching; 14) Research; 15) Citations; 16) International Outlook; 17) Industry income; 18) International Students; 19) Students:Staff Ratio; 20) Number of FTE Students; 21) Openness; 22) Impact.

— 2015: 1) Alumni; 2) Award; 3) HiCi; 4) N&S; 5) PUB; 6) PCP; 7) P (collab); 8) PP (collab); 9) P (impact); 10) PP (top 1 %); 11) PP (top 10 %); 12) PP (top 50 %); 13) PP (industry); 14) Presence; 15) Impact; 16) Openness; 17) Excellence.

Group «All ranking systems (normalized values)»:

— 2013: 1) PUB; 2) MNCS; 3) MGCD; 4) MCS; 5) P (collab); 6) PP (collab); 7) PP (int collab); 8) PP (UI collab); 9) Students:Staff Ratio; 10) Number of FTE Students; 11) Research; 12) Citations; 13) International Outlook; 14) Impact; 15) Openness; 16) Excellence.

— 2014: 1) Alumni; 2) Award; 3) HiCi; 4) N&S; 5) PUB; 6) PCP; 7) P (collab); 8) PP (collab); 9) P (impact); 10) PP (<100 km); 11) MNCS; 12) MCS; 13) PP (top 10 %); 14) PP (>1000 km); 15) Innovative Knowledge; 16) Excellence.

— 2015: 1) Alumni; 2) Award; 3) HiCi; 4) N&S; 5) PCP; 6) PUB; 7) P (collab); 8) PP (collab); 9) PP (int collab); 10) PP (industry); 11) P (impact); 12) PP (top 1 %); 13) PP (top 10 %); 14) PP (top 50 %); 15) Teaching; 16) Research; 17) Citations; 18) Students:Staff Ratio; 19) Number of FTE Students; 20) Industry income; 21) Impact; 22) Excellence; 23) Presence.

Table 6

Latent factors in each group

— Influential world ranking systems: Alumni; HiCi; N&S; PUB; PCP; Number of FTE Students; Teaching; Research; Citations.

— World ranking systems: PP (int collab); PP (top 10 %); PP (collab); Openness; Impact; Excellence.

— World ranking systems (normalized values): P (collab); Presence; Impact; Openness; Excellence.

— All ranking systems: Award; PUB; P (collab); PP (collab); PP (int collab); P (impact); PP (top 10 %); Research; Impact; Openness; Excellence.

— All ranking systems (normalized values): Alumni; Award; HiCi; N&S; PUB; PCP; P (collab); PP (collab); PP (int collab); Impact; Openness; Excellence; Students:Staff Ratio; Number of FTE Students; Research; Citations.

7. SWOT-analysis of research results

Strengths:

— The use of the multi-factor model allows universities to be evaluated on significant indicators, which reduces the time of evaluation while preserving the adequacy of the results;

— The use of significant indicators makes it easier to check the input data and to avoid mistakes in the input data on indicators;

— The multi-factor model provides stable and objective results, because the calculations are carried out on a three-year retrospective;

— The multi-factor model proposes a new ranking system (in fact, a meta-ranking system) based on the indicators and data of existing rankings.

Weaknesses:

— The multi-factor model is focused on the evaluation of the world's universities and has not been investigated at the national level.

Opportunities:

— Development of university ranking methods to improve the adequacy of results;

— Development of an information system for the world university ranking system;

— Development of a national ranking system.

Threats:

— The emergence of a foreign analogue of the multi-factor model.

8. Conclusion

As a result of the research, an integrated multi-factor model of world university ranking systems was developed. To achieve this aim, the following objectives were accomplished:

1. A review of world university ranking systems was conducted. Drawbacks were identified in each ranking system, which was the prerequisite for building the multi-factor model.

2. Original data on the indicators of the considered world university ranking systems were collected. Information was collected on the indicators with available values for each ranking system on a sample of the Top-50 universities.

3. Factor analysis of the influential world and world ranking systems was conducted. The research covered 2013, 2014 and 2015 — this retrospective analysis determines the stability and objectivity of the results. In addition, the research was carried out in different groups to provide objective results — we can see the latent factors in each group and an integrated view of the latent factors in world university ranking systems.

References

1. Reitynh vyshchykh navchalnykh zakladiv III-IV rivniv akredytatsii MONmolodsportu za 2010/2011 navchalnyi rik [Electronic resource] // Ministry of Education and Science of Ukraine. — Available at: \www/URL: http://old.mon.gov.ua/ua/activity/education/58/rejting/. — 01.04.2016. — Title screen.

2. Mmantsetsa, M. University Rankings: The Many Sides of the Debate [Text] / M. Mmantsetsa, J. W. Peter, F. Silvia // Management of Sustainable Development. — 2014. — Vol. 6, № 1. — P. 39-42. doi:10.2478/msd-2014-0006

3. Hazelkorn, E. World-class universities or world-class systems?: Rankings and higher education policy choices [Text] / Е. Hazelkorn; E. Hazelkorn, P. Wells, M. Marope (eds) // Rankings and Accountability in Higher Education: Uses and Misuses. — Paris: UNESCO, 2013. — P. 71-94.

4. Huang, Mu-Hsuan. A Comparison of Three Major Academic Rankings for World Universities: From a Research Evaluation Perspective [Text] / Mu-Hsuan Huang // Journal of Library and Information Studies. — 2011. — Vol. 9, № 1. — P. 1-25.

5. Academic Ranking of World Universities [Electronic resource]. — Available at: \www/URL: http://www.shanghairanking.com/ru/aboutarwu.html. — 07.04.2016. — Title screen.

6. QS World University Rankings [Electronic resource]. — Available at: \www/URL: http://www.topuniversities.com. — 07.04.2016. — Title screen.

7. Times Higher Education World University Rankings [Electronic resource]. — Available at: \www/URL: https://www.timeshighereducation.com/world-university-rankings. — 08.04.2016. — Title screen.


8. CWTS Leiden Ranking [Electronic resource]. — Available at: \www/URL: http://www.leidenranking.com. — 08.04.2016. — Title screen.

9. Ranking Web of Universities [Electronic resource]. — Available at: \www/URL: http://www.webometrics.info. — 09.04.2016. — Title screen.

10. Scimago Institutions Rankings [Electronic resource]. — Available at: \www/URL: http://www.scimagoir.com. — 09.04.2016. — Title screen.


Kavitska Viktoriia, Senior Lecturer, Department of System Software, Odessa National Polytechnic University, Ukraine, e-mail: kavickaya.v.s@gmail.com.

Liubchenko Vira, Doctor of Technical Sciences, Professor, Department of System Software, Odessa National Polytechnic University, Ukraine.
