
DOI: 10.15514/ISPRAS-2022-34(3)-7

Mobile Learning Platform focused on Learning Monitoring and Customization: Usability Evaluation Based on a Laboratory Study

1 H. del Ángel-Flores, ORCID: 0000-0001-8035-9301 <hdelangel.mca19@lania.edu.mx>
2 E. López-Domínguez, ORCID: 0000-0002-6167-6309 <eduardo.lopez.dom@cinvestav.mx>
3 Y. Hernández-Velázquez, ORCID: 0000-0002-5767-532X <yeseniahv@gmail.com>
1 S. Domínguez-Isidro, ORCID: 0000-0002-9546-8233 <saul.dominguez@lania.edu.mx>
4 M.A. Medina-Nieto, ORCID: 0000-0001-6391-4799 <maria.medina@uppuebla.edu.mx>
4 J. de la Calleja, ORCID: 0000-0002-6846-3162 <jorge.delacalleja@uppuebla.edu.mx>

1 Laboratorio Nacional de Informática Avanzada, Veracruz, México

2 Center for Research and Advanced Studies of the National Polytechnic Institute, CDMX, Mexico
3 Universidad Veracruzana, Veracruz, México
4 Polytechnic University of Puebla, Puebla, México

Abstract. Learning customization and monitoring are considered key aspects of teaching-learning processes. Some works have proposed mobile learning systems that provide teachers and students with learning monitoring and personalization services. One of the main requirements of these kinds of systems in terms of software quality is usability; however, few works have addressed usability issues using laboratory studies with users in real domains. In this work, we present a usability evaluation of the learning monitoring and personalization services of a mobile learning platform based on a laboratory study in which nine teachers and ten students participated. In our usability evaluation, the aspects evaluated were effectiveness, efficiency, and level of user satisfaction as proposed by the ISO/IEC 25000 family of standards. The results show that the teachers' effectiveness, efficiency, and satisfaction were considered satisfactory, while the students' effectiveness and satisfaction were classified as satisfactory and their efficiency as acceptable. The usability evaluation described in this work can serve as a reference for developers seeking to improve the development of learning monitoring and personalization services.

Keywords: mobile learning; learning customization; learning monitoring; software quality; usability evaluation

For citation: del Ángel-Flores H., López-Domínguez E., Hernández-Velázquez Y., Domínguez-Isidro S., Medina-Nieto M.A., de la Calleja J. Mobile Learning Platform focused on Learning Monitoring and Customization: Usability Evaluation Based on a Laboratory Study. Trudy ISP RAN/Proc. ISP RAS, vol. 34, issue 3, 2022, pp. 89-110. DOI: 10.15514/ISPRAS-2022-34(3)-7

Платформа мобильного обучения, ориентированная на мониторинг и настройку обучения: оценка удобства использования на основе лабораторного исследования

1 Х. дель Анхель-Флорес, ORCID: 0000-0001-8035-9301 <hdelangel.mca19@lania.edu.mx>
2 Э. Лопес-Домингес, ORCID: 0000-0002-6167-6309 <eduardo.lopez.dom@cinvestav.mx>
3 Е. Эрнандес-Веласкес, ORCID: 0000-0002-5767-532X <yeseniahv@gmail.com>
1 С. Домингес-Исидро, ORCID: 0000-0002-9546-8233 <saul.dominguez@lania.edu.mx>
4 М.А. Медина-Ниэто, ORCID: 0000-0001-6391-4799 <maria.medina@uppuebla.edu.mx>
4 Х. де ла Каллеха, ORCID: 0000-0002-6846-3162 <jorge.delacalleja@uppuebla.edu.mx>

1 Национальная лаборатория перспективной информатики, Мексика, Веракрус

2 Центр перспективных исследований Национального политехнического института, Мексика, CDMX
3 Университет Веракрус, Мексика, Веракрус
4 Политехнический университет Пуэблы, Мексика, Пуэбла

Аннотация. Индивидуализация обучения и мониторинг считаются ключевыми аспектами процессов преподавания и обучения. В некоторых работах предлагались мобильные системы обучения, которые предоставляют учителям и учащимся услуги мониторинга и персонализации обучения. Одним из основных требований к такого рода системам с точки зрения качества программного обеспечения является удобство использования; однако лишь в нескольких работах рассматривались вопросы удобства использования с использованием лабораторных исследований с пользователями в реальных доменах. В этой работе мы представляем оценку удобства использования сервисов мониторинга и персонализации обучения мобильной обучающей платформы на основе лабораторного исследования, в котором приняли участие девять учителей и десять студентов. В нашей оценке удобства использования оценивались такие аспекты, как эффективность, результативность и уровень удовлетворенности пользователей, как это предлагается в семействе стандартов ISO/IEC 25000. Результаты показывают, что учителя оценили эффективность, результативность и удовлетворенность как удовлетворительные, в то время как учащиеся оценили эффективность и удовлетворенность как удовлетворительные, а результативность как приемлемую. Оценка удобства использования, описанная в этой работе, может служить справочным материалом для разработчиков, стремящихся улучшить разработку сервисов мониторинга и персонализации обучения.

Ключевые слова: мобильное обучение; индивидуализация обучения; мониторинг обучения; качество программного обеспечения; оценка удобства использования

Для цитирования: дель Анхель-Флорес Х., Лопес-Домингес Э., Эрнандес-Веласкес Е., Домингес-Исидро С., Медина-Ниэто М.А., де ла Каллеха Х. Платформа мобильного обучения, ориентированная на мониторинг и настройку обучения: оценка удобства использования на основе лабораторного исследования. Труды ИСП РАН, том 34, вып. 3, 2022 г., стр. 89-110. DOI: 10.15514/ISPRAS-2022-34(3)-7.

1. Introduction

Currently, learning monitoring and customization are considered essential aspects of the teaching-learning processes [1], [2]. Learning monitoring is any procedure that provides feedback and information on student progress, which leads to their self-assessment or reflection on the learning process. Furthermore, learning monitoring helps identify students' competencies, what they know, and what they do [2]. The purpose of learning monitoring is to advise students, offer guidance, correct mistakes, help them overcome difficulties in the learning process, and keep track of the process followed by students [2].

On the other hand, learning customization is based on the idea that students learn in different ways and at different rates. It considers the student's knowledge, needs, abilities, and perceptions in the learning process and is therefore regarded as learner-centered training [1]. Both learning monitoring and personalization complement the teaching-learning process.

However, in practice, it is difficult to carry them out within the framework of traditional education, since it is hard for a teacher to keep track of each of their students and even more difficult to personalize their education based on their skills and abilities. In this context, some works have proposed mobile learning systems [3]-[18]. In particular, the mobile learning platform presented in [18] is characterized by providing teachers and students with various learning monitoring and personalization services that include mobile learning objects, considering the students' learning styles and context information. This platform comprises three main components: a mobile learning object generator system (SiGOAM), a mobile learning object repository (MLOR), and a mobile application (MoApp) focused on the student. The integration of these three components enables and facilitates the teacher's systematized implementation of various strategies for learning monitoring and customization. In terms of software quality, one of the main requirements that this type of system must consider is usability, which refers to the degree to which a software product can be used by a certain group of users to achieve clearly defined objectives with effectiveness, efficiency, and satisfaction [19], [20].

However, there is limited research work [7], [21]-[27] focused on the usability evaluation of mobile learning systems that identifies and addresses usability issues using laboratory studies with students and teachers in a real domain. Some of the possible problems derived from a lack of usability evaluation in these types of systems are potential errors in the interface design, unacceptable ease of use, and errors that the user makes when interacting with the software in a real environment, among others [28]. This work presents a usability evaluation of the learning monitoring and personalization services of the mobile learning platform proposed in [18], based on a laboratory study in which nine teachers and ten students participated. In our usability evaluation, the aspects evaluated were effectiveness, efficiency, and level of user satisfaction as proposed by the ISO/IEC 25010 and ISO/IEC 25022 standards [19], [20]. Based on the results obtained, it was determined that users with a teacher profile presented 87.11% effectiveness, 84.08% efficiency, and 7.53 satisfaction concerning SiGOAM and MLOR. These three results are considered satisfactory.

On the other hand, users with a student profile presented 80.00% effectiveness, 77.30% efficiency, and 7.37 satisfaction regarding the mobile application (MoApp). The effectiveness and satisfaction scores are classified as satisfactory, and the efficiency as acceptable. Based on the feedback from the users who participated in the usability evaluation, improvements and extensions were carried out to achieve a higher degree of usability of the mobile learning platform. The usability evaluation described in this work can serve as a reference for developers seeking to improve the development of learning monitoring and personalization services.

This paper is organized as follows: Section 2 presents the state of the art in usability evaluations applied to mobile learning tools focused on learning monitoring and/or customization. Section 3 describes the usability evaluation carried out and the evaluation instruments used. Section 4 presents the analysis of the results obtained from the usability evaluation. Section 5 describes the improvements and extensions made to the platform, considering the results obtained from the usability evaluation. Finally, the conclusions and future work are presented in Section 6.

2. Related Work

Some works proposed in the specialized literature have carried out usability evaluations of mobile learning systems [7], [21]-[27]. These works used diverse evaluation approaches and aspects, as well as different evaluation instruments. A comparative analysis is presented in Table 1.

Table 1. Comparative table of related works to usability evaluations.

[7] [21] [22] [23] [24] [25] [26] [27]

General characteristics Web V V

Mobile V V V V V V

Learning monitoring services V V V V V

Learning reinforcement services V V V V V

Learning customization services V V V

Field study V V V

Laboratory study V V V V V

Nielsen Decalogue Show system status V V

Maintain consistency between system and reality V V

Give the user full control V V

Stick to standards and be consistent V V

Prevent errors V V

Let the user choose instead of requiring them to remember things V V

Ensure flexibility and efficiency V V V V

Take care of aesthetics and moderation V V

Ensure effective error handling V V

Provide support and documentation V

ISO/IEC 25010 Effectiveness V V V V V V

Efficiency V V

Satisfaction V V V V V V

Freedom from risk

Context coverage

Evaluation instruments Questionnaire proposed at work V V V V V

Observation V

System Usability Scale (SUS) V V V

USE questionnaire V

Video recording V V

Group interview V

Based on the works reviewed, Table 1 shows a comparative analysis in three aspects: learning services evaluated, usability aspects/characteristics evaluated, and evaluation instruments used. The target systems of the usability evaluations proposed in the works reviewed offer different services, which were grouped into learning monitoring, reinforcement, and customization services. In this aspect, we identified three works [7], [23], [24] that carry out usability evaluations of learning customization services; however, two of these works only assess the characteristics of effectiveness and satisfaction proposed by the ISO/IEC 25010 standard.

On the other hand, the usability aspects evaluated in the studies reviewed were grouped into the characteristics mentioned in the Nielsen decalogue proposed in [29], [30] and those suggested by the ISO/IEC 25010 standard [19]. Although the ISO/IEC 25010 standard proposes five characteristics to be evaluated, we found that usability evaluations of mobile learning systems most frequently assess the characteristics of effectiveness and satisfaction; only [21] and [26] evaluate the three characteristics of effectiveness, efficiency, and satisfaction. Regarding the evaluation instruments used, the works proposed in [21], [22], and [25]-[26] used questionnaires that allow the evaluation of the characteristics proposed in the ISO/IEC 25010 standard.

On the other hand, the works proposed in [7], [21], and [26] used the System Usability Scale (SUS) instrument described in [29], which allows calculating the degree of user satisfaction with the evaluated software. Other evaluation instruments identified in the works are observation by the evaluators taking notes [27], video recording of the test carried out [26], and group interviews [22]. Finally, we note that there is a lack of works that carry out a usability evaluation of learning monitoring and customization services based on a laboratory study with users to evaluate the characteristics of effectiveness, efficiency, and satisfaction proposed by ISO/IEC 25010.

3. Usability Evaluation Description

In this work, we carry out a usability evaluation of the learning monitoring and personalization services of the mobile learning platform proposed in [18] based on a laboratory study in which nine teachers and ten students participated. In the subsequent sections, we present the details of the usability evaluation carried out, starting with the description of the mobile learning platform, the definition of the laboratory studies, the description of the instruments generated for the evaluation, and ending with the details on the execution of the evaluation.

3.1 Mobile Learning Platform

The mobile learning platform presented in [18] is distinguished from the related works proposed in the specialized literature by the following aspects:

a) Consider learning styles and incorporate mechanisms that capture students' physical activity to provide recommendations of learning objects related to student preferences and conditions;

b) Provide recommendations of learning objects that were useful to other students with equal or similar learning styles;

c) Allow monitoring of the studied learning objects by students; and

d) Integrate three tools in a single platform: a mobile learning object generator system (SiGOAM), a mobile learning object repository (MLOR), and a mobile application (MoApp), which allows carrying out a complete flow from the moment the professor designs and creates a mobile learning object (MLO) until the student consults it from a mobile device and subsequently obtains feedback and self-assessment (see Fig. 1).

Fig. 1. Mobile Learning Platform components: SiGOAM, MLOR, and MoApp

The integration of these three components enables the professor to systematically implement various strategies for learning monitoring and customization.

3.1.1 SiGOAM Description

SiGOAM offers the professor multiple services grouped into four modules to generate quality MLOs: analysis, design, development, and product-oriented tests. The analysis module provides the professor with a guide for gathering information and producing the activities that compose the MLO (see Fig. 2). In the design module, the professor can aesthetically build the educational content obtained in the analysis module. The development module allows the professor to generate a functional MLO prototype. Finally, in the product-oriented tests module, the professor can assess the technological, pedagogical, and usability aspects of the MLO to obtain feedback from the students and improve it. The learning objects generated by SiGOAM have the following general structure: introduction, lessons, examples, exercises, and evaluations.

Fig. 2. Interface to consult and add resources [18]
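To make the structure just described more concrete, the following is a minimal sketch (illustrative only; the class and field names are hypothetical and not taken from the platform's source code) of how an MLO generated by SiGOAM could be represented, including the general sections mentioned above and the learning-style and context metadata that the MLOR and MoApp use for classification and recommendation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    """A single MLO section item (lesson, example, exercise, or evaluation)."""
    title: str
    resource_type: str   # e.g. "text", "video", "audio"
    content_uri: str

@dataclass
class MobileLearningObject:
    """Hypothetical representation of an MLO generated by SiGOAM."""
    title: str
    introduction: str
    lessons: List[Activity] = field(default_factory=list)
    examples: List[Activity] = field(default_factory=list)
    exercises: List[Activity] = field(default_factory=list)
    evaluations: List[Activity] = field(default_factory=list)
    # Metadata used by the MLOR and MoApp for classification and recommendation
    learning_styles: List[str] = field(default_factory=list)  # e.g. ["reflective"]
    context: str = "at_rest"   # "at_rest" or "in_motion"
```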

3.1.2. MLOR Description

The MLOR component allows teachers to store, catalog, consult, and visualize MLOs generated by SiGOAM, as well as classify MLOs according to the learning style and context defined by the professor when building them. The MLOR also allows the professor to monitor every mobile learning object consulted by students, including the time invested in each of them; see Fig. 3.

Fig. 3. Detail of learning objects studied [18]

3.1.3. MoApp Description

Finally, the mobile application (MoApp) allows the student to consult, view, interact with, and recommend MLOs based on their learning style and specific context. For example, if a student is moving, ideally the application recommends learning objects that do not involve lessons in which the student has to read text on the screen; in this case, the application recommends learning objects that include videos and audio. The application also allows students to evaluate learning objects, considering how useful they were for understanding the studied topic (see Fig. 4).

Fig. 4. Recommended learning objects by students [18]
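To illustrate the recommendation behavior described above, the following minimal sketch (hypothetical function and field names; not the platform's actual code) filters MLOs so that, when the student is in motion, only objects consumable as audio or video are considered, and otherwise ranks objects by their overlap with the student's predominant learning styles.

```python
def recommend(mlos, student_styles, in_motion):
    """Filter and rank MLOs by the student's context and predominant learning styles.

    mlos: iterable of MLO-like items exposing 'learning_styles' and a
    'resource_types' summary; student_styles: list of style names.
    """
    candidates = []
    for mlo in mlos:
        # When the student is moving, skip text-heavy objects and keep only
        # those that can be consumed as audio or video.
        if in_motion and not ({"audio", "video"} & set(mlo.resource_types)):
            continue
        # Rank by overlap with the student's predominant learning styles.
        overlap = len(set(mlo.learning_styles) & set(student_styles))
        candidates.append((overlap, mlo))
    # Highest style overlap first.
    return [mlo for overlap, mlo in sorted(candidates, key=lambda pair: -pair[0])]
```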

3.1.4. Learning Monitoring and Customization Services

The main learning monitoring services provided to the professor by the platform are:

a) Building mobile learning objects based on the student's learning styles and context;

b) Monitoring the learning objects utilized by the students, as well as the total time invested and the times the student consulted a learning object; and

c) Suggesting new learning objects to reinforce a particular subject that is difficult for the student.

On the other hand, the main learning customization services offered by the platform are:

a) Identify student learning styles by applying the Honey - Alonso questionnaire (CHAEA) [31];

b) Obtain the context of the student based on the data collected by the sensors of the mobile device to determine the activity that the student is doing, which can be at rest or in motion;

c) Recommend learning objects based on the learning styles and physical activity of the students; and

d) Recommend learning objects evaluated by other students with the same learning styles.

3.2 Laboratory Studies

In this work the laboratory studies were designed to evaluate the usability of SiGOAM, MLOR, and MoApp from the point of view of their respective end-users. Usability evaluations based on laboratory studies have the following advantages:

• Identification of problems to improve the design of the software.

• Confirm or question assumptions made during the design process.

• Collection of quantitative data, for example, how long it takes users to complete a task and the number, type and severity of errors they make.

• Feedback from target users.


• Detection of usability problems before the launch of a software product.

• Time and cost savings when dealing with concerns.

• Obtain information on user satisfaction regarding the software before its general launch.

• Validate usability requirements.

• Impartial evaluation of the software.

In our case, we carry out a laboratory study to evaluate the usability of the SiGOAM and MLOR services, and another laboratory study to evaluate the usability of the MoApp services. Details are described in the following subsections.

3.2.1 Laboratory Study of the Usability Evaluation Applied to SiGOAM and MLOR

This section describes the scenario of the laboratory study that frames the usability evaluations of SiGOAM and MLOR. In the SiGOAM authoring tool, the teacher user builds an MLO using the services provided by SiGOAM in four phases: analysis, design, development, and product-oriented testing. Afterward, the teacher user publishes and distributes in the MLOR the MLOs generated with SiGOAM.

3.2.1.1 Case Study Definition

This section addresses the main aspects of the case study designed to evaluate the SiGOAM and MLOR.

The object of study: The objects of study were the SiGOAM and MLOR. SiGOAM can be accessed through the link: http://sigoam.lania.mx/login and MLOR can be accessed through the link: http://roa.lania.mx. With this, the accessibility and availability of the platform were remotely guaranteed.

Purpose: To evaluate the characteristics of effectiveness, efficiency, and level of satisfaction proposed in the ISO/IEC 25010 [19] and ISO/IEC 25022 [20] standards, through a laboratory study with a teacher profile.

Quality approach: The aspects evaluated were effectiveness, efficiency, and level of user satisfaction with respect to SiGOAM and MLOR.

Perspective: Obtain the point of view of users with a teacher profile through comments and suggestions.

Context: The experiment was carried out virtually through individual video calls with each participant, with the support of a total of nine teachers: seven from the Teaching Center of the National Laboratory of Advanced Informatics (LANIA), located in the city of Xalapa, Veracruz, Mexico, and two from the faculty of the Polytechnic University of Puebla, located in the city of Puebla, Puebla, Mexico.

The objective of the laboratory study was to analyze the flow entailed by the construction, publication, and distribution of the MLOs in order to evaluate usability in terms of effectiveness, efficiency, and user satisfaction with respect to SiGOAM and MLOR, from the point of view of the professor users from LANIA and the Polytechnic University of Puebla.

3.2.1.2 Planning

This section details the activities carried out to apply the usability evaluation. Selection of subjects: we selected seven professors from the LANIA Teaching Center, representing 70% of its staff, with intermediate and advanced knowledge of computers and mobile devices. In addition, two professors from the Polytechnic University of Puebla participated. Two participants reported previous experience as MLO users with a teacher profile. The participants were selected to gather their perspectives regarding SiGOAM and MLOR in order to identify and address elements for improvement. Table 2 presents the main characteristics of the participants in the usability evaluation with a teacher profile. Four participants are male (44.44%), and the remaining five are female (55.56%). Most of the participants (55.56%) are between 36 and 45 years old. Two participants have master's degrees (22.22%), and seven have doctoral degrees (77.78%).

Table 2. Characteristics of the participants with a teacher profile

Characteristics Number of participants

Gender

Man 4

Woman 5

Age

26 - 35 1

36 - 45 5

46 - 55 3

Level of studies

Master's degree 2

Doctorate 7

3.2.1.3 Design of the Experiment

The general elements regarding the design of the experiment are presented below. Randomization: The tasks to be carried out by the teachers were not assigned randomly. For the construction of the MLO during the evaluation test, each participant was asked to have on hand a topic of their choice with an introduction, examples, exercises, and a brief evaluation. For the distribution of the MLOs, the same objects generated in the SiGOAM usability evaluation were used. For the evaluation, it was necessary to previously register test students in the MLOR so that the teachers could carry out the requested tasks. The details of the tasks performed by the teachers are described in the Instrument I section.

3.2.2 Laboratory Study of the Usability Evaluation Applied to MoApp

This section describes the scenario of the laboratory study under which the usability evaluation of the mobile application (MoApp) was carried out with student-profile users, using MLOs generated according to their context, i.e., whether the student is at rest or in motion, and their predominant learning styles.

3.2.2.1 Case Study Definition

This section addresses the main aspects of the case study designed to evaluate the MoApp. The object of study: The object of study is the MoApp developed on Android. Each user was asked to install the application on their mobile device.

Purpose: To evaluate the characteristics of effectiveness, efficiency, and level of satisfaction proposed in the ISO/IEC 25010 [19] and ISO/IEC 25022 [20] standards, through a laboratory study with real users playing the role of students.

Quality approach: The evaluated aspects are effectiveness, efficiency, and level of user satisfaction concerning the MoApp.

Perspective: Obtain the point of view of users with the student role through comments and suggestions.

Context: The experiment was carried out virtually through individual video calls with each participant, with the support of ten students of the master's degree in Applied Computing from the Teaching Center of the National Laboratory of Advanced Informatics, located in Xalapa, Veracruz, Mexico.

The objective of the laboratory study was to analyze the use of MLOs by students in order to evaluate usability in terms of effectiveness, efficiency, and satisfaction.

3.2.2.2 Planning

This section details the activities carried out to apply the usability evaluation. Selection of subjects: For the evaluation of the mobile application (MoApp), an open invitation was sent by email, to which ten students of the master's degree in Applied Computing of the Teaching Center of the National Laboratory of Advanced Informatics responded. The only requirement to participate was to have a mobile device with the Android operating system. Their participation was voluntary and was not conditioned on any type of benefit. Table 3 shows the main characteristics of the participants in the usability evaluation with the student profile. Nine men (90%) and one woman (10%) participated in the evaluation. Most of them are between 18 and 25 years old (60%). All of them have a bachelor's degree.

Regarding the Android versions of the mobile devices used for the usability test, two devices ran Android 9 (20%), five Android 10 (50%), and three Android 11 (30%).

Table 3. Characteristics of the participants with a student profile

Characteristics Number of participants

Gender

Man 9

Woman 1

Age

18 - 25 6

26 - 35 3

36 - 45 1

Level of studies

Bachelor's degree 10

Android version

9 2

10 5

11 3

3.2.2.3 Design of the Experiment

The general elements regarding the design of the experiment are presented below. Randomization: The tasks that the students had to perform for the usability evaluation were not assigned randomly. The tasks that the students carried out are described in Instrument II. The MLOs used by the students were the same ones that the teachers constructed and published in the laboratory study described for SiGOAM and MLOR.

3.2.2.4 Operation

Teachers and students were not informed about the usability characteristics to be evaluated; they were only told that the purpose of the study was to analyze and assess the quality in use of the mobile learning platform. Afterward, teachers and students signed an informed consent document. In addition, no platform training was provided before the execution of the usability test.

3.3 Evaluation Instruments

Based on the characteristics and metrics proposed in the ISO/IEC 25010 [19] and ISO/IEC 25022 [20] standards, three evaluation instruments were designed. The instruments generated to carry out the usability evaluation can be classified as follows:

• Instrument to collect data to evaluate the effectiveness and efficiency of SiGOAM and MLOR from the teachers' perspective, hereinafter Instrument I.

• Instrument to collect data to evaluate the effectiveness and efficiency of MoApp of the mobile learning platform from the perspective of the students, hereinafter Instrument II.

• Instrument to collect data that allows obtaining the degree of satisfaction of the users of the mobile learning platform and identifying the characteristics to be improved on the platform, hereinafter Instrument III.

3.3.1 Instrument I Description

This instrument was used to measure the effectiveness and efficiency of users with a teacher profile with respect to the Mobile Learning Object Generator System (SiGOAM) and the Mobile Learning Object Repository (MLOR), specifically evaluating the learning monitoring and personalization services. Table 4 shows the breakdown of the services included in Instrument I, grouped by tasks.

Table 4. Instrument I task breakdown

Task Subtask

Analysis module Create a new MLO

Register the student's profile

Add content

Add examples

Add exercises

Record metadata

Design module Design content

Design examples

Design exercises

Design evaluation

Development module Build MLO

Publish MLO

Product-oriented assessment module Usability assessment

Assessment of technological and pedagogical aspects

Publish MLO Complete MLO publication

Manage group Register new group

Add category to group

Add student to group

Add MLO to category

Consult studied MLO

Reinforcement Suggest MLO for reinforcement

3.3.2 Instrument II Description

This instrument was used to evaluate the effectiveness and efficiency of the students concerning the platform's mobile application (MoApp). In addition, learning monitoring and personalization services were specifically evaluated. Table 5 shows the breakdown of the services included in instrument II, grouped by tasks.

Table 5. Instrument II task breakdown

Task Subtask

Set learning style Answer the CHAEA questionnaire

Consult learning style

Set preferences

Consult MLO Consult added MLO

Consult reinforcement MLO


Consult recommended MLO

Interact with the MLO View detail of MLO

View of MLO

Evaluate MLO

Consult best evaluated MLO

Support Consult teachers

Send mail to teacher

3.3.3 Instrument III Description

This instrument was used to evaluate the degree of satisfaction of users with the teacher and student profiles. It is an adaptation of version 7.0 of the Questionnaire for User Interface Satisfaction (QUIS) proposed in [32]. It consists of thirty-two questions grouped into six categories, which are:

• Global reaction to the system. It presents the user with questions about their perception regarding utility, flexibility, ease of use, and other general aspects of the system.

• General reactions on the screen. It collects information to evaluate screen characteristics such as typography, design, distribution, and sequence between windows.

• System information and terminology. It contains questions that seek to evaluate the concepts used, to determine whether they are useful for the user to complete the tasks within the system.

• Learning capacity. It questions the user regarding the ease of learning to use the software.

• System capabilities. It collects information on the performance of the system and on recovery from errors made by the user.

• Ease of use and user interface. It questions the user about general aspects of the software interface design.

3.4 Execution of the Evaluation

This section details the activities to carry out the usability evaluation.

3.4.1 Execution of Usability Evaluation with Teachers

For the evaluation with teachers, the experiment lasted approximately an hour and a half, framed in a video call, during which the teachers carried out the tasks described in Instrument I. For each task, a note was made of its start and end time, or of the abandonment time in case it was not completed. After the usability test, the user satisfaction questionnaire (QUIS 7.0) [32] was applied. Participants were asked to comment out loud on the problems they encountered, their impressions, and what they were trying to do on the platform, and the session was video recorded to capture their comments regarding their experience with the platform.

3.4.2 Execution of Usability Evaluation with Students

For the usability evaluation from the students' perspective, the experiment lasted forty-five minutes through a video call, during which the students performed the tasks described in Instrument II. The start time of each task was recorded, as well as its completion time or the abandonment time in case it was not completed. After the usability test, the user satisfaction questionnaire (QUIS 7.0) proposed in [32] was applied. Beforehand, each participant had been asked to install the platform's MoApp on their mobile device with the Android operating system. The test was video recorded to analyze the comments that the participants made while carrying out the requested tasks.

4. Analysis of Results

This section describes the results obtained after conducting the usability evaluation in terms of effectiveness, efficiency, and satisfaction.
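For reference, the figures reported in the following subsections are consistent with the quality-in-use metrics of ISO/IEC 25022 [20] and the usability concepts of [33]; the expressions below summarize, under that assumption, how the effectiveness, efficiency, and satisfaction values appear to be computed (average per-user completion rate, time-based overall relative efficiency per task, and the mean of the QUIS category averages).

```latex
% Effectiveness: average per-user completion rate over the n participants
\text{Effectiveness} = \frac{1}{n}\sum_{i=1}^{n}\frac{\text{tasks completed by user } i}{\text{tasks attempted by user } i}\times 100\%

% Time-based overall relative efficiency of task j
% (n_{ij} \in \{0,1\}: completion flag, t_{ij}: time user i spent on task j)
\text{Efficiency}_j = \frac{\sum_{i=1}^{n} n_{ij}\, t_{ij}}{\sum_{i=1}^{n} t_{ij}}\times 100\%

% Satisfaction: mean of the six QUIS 7.0 category averages \overline{Q_k} (0-9 scale)
\text{Satisfaction} = \frac{1}{6}\sum_{k=1}^{6}\overline{Q_k}
```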

4.1. Results of Instrument I

Instrument I was applied to users with a teacher profile, who were asked to perform seven tasks distributed between the SiGOAM and MLOR applications. Each participant was asked to complete this instrument to determine their effectiveness and efficiency on the platform. The results obtained indicate that users present 87.11% average effectiveness and 84.08% average efficiency for SiGOAM and MLOR. A detailed description of the results follows.

4.1.1 Effectiveness Results

Instrument I allowed us to obtain the effectiveness value of the users for SiGOAM and MLOR. The results obtained for each one are described in detail below.

4.1.1.1 Results of SiGOAM

In Table 6, a 1 indicates that the user completed the task successfully and a 0 otherwise. To obtain the effectiveness value per user, the completed tasks were counted and the resulting value was divided by the total number of attempted tasks. In SiGOAM, monitoring and customization services were evaluated in tasks grouped by the analysis, design, development, and product-oriented testing modules. Only five of the nine participants (55.56%) completed all the tasks successfully, while the others failed one of the tasks (44.44%).

Table 6. Results of task completion in SiGOAM for users with a teacher profile

Task Task completion

U1 U2 U3 U4 U5 U6 U7 U8 U9

Analysis module 0 1 0 1 1 1 1 1 1

Design module 1 1 1 1 1 1 1 1 0

Development module 1 0 1 1 1 1 1 1 1

Product-oriented assessment module 1 1 1 1 1 1 1 1 1

Effectiveness value 0.75 0.75 0.75 1.00 1.00 1.00 1.00 1.00 0.75

Considering the task completion values in Table 6 and substituting these values in the effectiveness formula, we have:

SiGOAM Effectiveness = ((0.75 + 0.75 + 0.75 + 1.00 + 1.00 + 1.00 + 1.00 + 1.00 + 0.75) / 9) × 100 = (8.00 / 9) × 100 = 88.89%.

4.1.1.2 Results of MLOR

Table 7 presents the tasks corresponding to the MLOR and indicates with 1 those that users completed and with 0 those that users did not complete successfully. In this regard, the effectiveness value is obtained by dividing the number of completed tasks by the total number of attempted tasks. The activities evaluated in the MLOR were grouped into the tasks: publish MLO, group management, and reinforcement. Five of the participants managed to complete all these tasks without problems (55.56%), while the others had difficulty completing one of the tasks (44.44%).

Table 7. Results of task completion in MLOR for users with a teacher profile

Task Task completion

U1 U2 U3 U4 U5 U6 U7 U8 U9

Publish MLO 1 1 1 1 1 1 1 1 1

Manage group 0 0 1 0 1 0 1 1 1

Reinforcement 1 1 1 1 1 1 1 1 1

Effectiveness value 0.67 0.67 1.00 0.67 1.00 0.67 1.00 1.00 1.00

Considering the task completion values in Table 7, and substituting these values in the effectiveness formula, we have:

MLOR Effectiveness = ((0.67 + 0.67 + 1.00 + 0.67 + 1.00 + 0.67 + 1.00 + 1.00 + 1.00) / 9) × 100 = (7.68 / 9) × 100 = 85.33%.

4.1.1.3 Average Effectiveness Result

We obtained the final effectiveness result through the average of the effectiveness values in both SiGOAM and MLOR. The value obtained indicates that users with a teacher profile managed to complete the tasks in SiGOAM and MLOR with 87.11% effectiveness, which is considered a satisfactory value according to the ranges of acceptability for effectiveness reported in [33].
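As a reproducibility aid, the short Python sketch below (written for this paper from the data in Tables 6 and 7, not taken from the authors' tooling) recomputes the effectiveness values reported above.

```python
# Per-user completion rates (fraction of attempted tasks completed), Tables 6 and 7.
sigoam_rates = [0.75, 0.75, 0.75, 1.00, 1.00, 1.00, 1.00, 1.00, 0.75]  # 4 SiGOAM tasks per user
mlor_rates   = [0.67, 0.67, 1.00, 0.67, 1.00, 0.67, 1.00, 1.00, 1.00]  # 3 MLOR tasks per user

def effectiveness(rates):
    """Average per-user completion rate, expressed as a percentage."""
    return 100 * sum(rates) / len(rates)

sigoam_eff = effectiveness(sigoam_rates)    # ~88.89
mlor_eff = effectiveness(mlor_rates)        # ~85.33
average_eff = (sigoam_eff + mlor_eff) / 2   # ~87.11
print(round(sigoam_eff, 2), round(mlor_eff, 2), round(average_eff, 2))
```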

4.1.2 Efficiency Results

Instrument I also allowed us to obtain the efficiency value of the users regarding SiGOAM and MLOR. The results obtained for each one are described in detail below.

4.1.2.1 Results of SiGOAM

Table 8 shows the times in minutes that each user with a teacher profile took to complete each SiGOAM task described in Instrument I. For users who could not complete a task, the time at which they abandoned it is indicated.

Table 8. Results of times of users with teacher profiles in SiGOAM

Task Total time in minutes

U1 U2 U3 U4 U5 U6 U7 U8 U9

Analysis module 29 29 45 60 39 31 34 47 22

Design module 9 11 6 6 6 6 6 15 6

Development module 3 9 2 2 2 3 2 3 2

Product-oriented assessment module 4 2 2 6 2 3 2 4 3

The user efficiency value for SiGOAM considers the task completion rate in addition to the completion times shown in Table 8. For this, the efficiency value per task was calculated, and the efficiency values of tasks T1, T2, T3, and T4 were averaged, obtaining that users with a teacher profile completed the SiGOAM tasks with 84.34% efficiency. Data for the calculation are the following:

Efficiency T1: 77.98
Efficiency T2: 91.55
Efficiency T3: 67.86
Efficiency T4: 100.00

SiGOAM Efficiency = (77.98 + 91.55 + 67.86 + 100.00) / 4 = 337.39 / 4 ≈ 84.34%.

4.1.2.2 Results of MLOR

Table 9 shows the times, expressed in minutes, that each user with a teacher profile took to perform the MLOR tasks described in instrument I. For the case of users who could not complete the task, the time in which they abandoned the task is indicated.

Table 9. Results of users with teacher profiles in MLOR

Task Total time in minutes

U1 U2 U3 U4 U5 U6 U7 U8 U9

Publish MLO 4 3 3 4 2 3 1 6 3

Manage group 8 10 5 8 8 8 5 9 9

Reinforcement 4 2 1 7 2 5 4 10 4

To obtain the efficiency value of the users concerning the MLOR, in addition to the task completion rate, the completion times shown in Table 9 were considered. For this, the efficiency value per task was calculated, and the efficiency values of tasks T5, T6, and T7 were averaged, obtaining that users with a teacher profile completed the MLOR tasks with 83.81% efficiency. Data for the calculation are the following:

Efficiency T5: 100.00
Efficiency T6: 51.42
Efficiency T7: 100.00

MLOR Efficiency = (100.00 + 51.42 + 100.00) / 3 = 251.42 / 3 ≈ 83.81%.

4.1.2.3 Final Efficiency Result

Finally, to obtain the efficiency value of the users for the SiGOAM and MLOR applications, the average of the SiGOAM and MLOR efficiency values was calculated. The value obtained indicates that users with a teacher profile completed the tasks in SiGOAM and MLOR with 84.08% average efficiency, which is considered satisfactory according to the acceptability ranges for efficiency presented in [33].
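The per-task efficiency values used above are consistent with the time-based overall relative efficiency metric (time spent on successfully completed attempts divided by total time spent). The sketch below, built from the completion flags in Tables 6-7 and the times in Tables 8-9, reproduces the reported teacher-profile figures up to rounding, and the same function also reproduces the per-task values reported for the students in Section 4.2.2. It is an illustration written for this paper, not the authors' own script.

```python
def task_efficiency(completed, minutes):
    """Time-based overall relative efficiency of one task, in percent."""
    productive_time = sum(c * t for c, t in zip(completed, minutes))
    return 100 * productive_time / sum(minutes)

# SiGOAM tasks T1-T4: completion flags (Table 6) and times in minutes (Table 8).
sigoam_tasks = [
    ([0, 1, 0, 1, 1, 1, 1, 1, 1], [29, 29, 45, 60, 39, 31, 34, 47, 22]),  # T1 analysis
    ([1, 1, 1, 1, 1, 1, 1, 1, 0], [9, 11, 6, 6, 6, 6, 6, 15, 6]),         # T2 design
    ([1, 0, 1, 1, 1, 1, 1, 1, 1], [3, 9, 2, 2, 2, 3, 2, 3, 2]),           # T3 development
    ([1, 1, 1, 1, 1, 1, 1, 1, 1], [4, 2, 2, 6, 2, 3, 2, 4, 3]),           # T4 product tests
]
# MLOR tasks T5-T7: completion flags (Table 7) and times in minutes (Table 9).
mlor_tasks = [
    ([1, 1, 1, 1, 1, 1, 1, 1, 1], [4, 3, 3, 4, 2, 3, 1, 6, 3]),           # T5 publish MLO
    ([0, 0, 1, 0, 1, 0, 1, 1, 1], [8, 10, 5, 8, 8, 8, 5, 9, 9]),          # T6 manage group
    ([1, 1, 1, 1, 1, 1, 1, 1, 1], [4, 2, 1, 7, 2, 5, 4, 10, 4]),          # T7 reinforcement
]

sigoam_eff = sum(task_efficiency(c, t) for c, t in sigoam_tasks) / len(sigoam_tasks)  # ~84.3
mlor_eff = sum(task_efficiency(c, t) for c, t in mlor_tasks) / len(mlor_tasks)        # ~83.8
print(round((sigoam_eff + mlor_eff) / 2, 2))  # ~84.08, the reported average efficiency
```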

4.2. Results of Instrument II

Instrument II was applied to users with a student profile, who were asked to perform four tasks in the MoApp of the mobile learning platform, focused on the learning monitoring and customization services. The students completed Instrument II to collect the information needed to determine the effectiveness and efficiency of MoApp. The results obtained indicate that the users present 80.00% effectiveness and 77.30% efficiency for the evaluated mobile application.

4.2.1 Effectiveness Results

Table 10 indicates with a 1 if the user completed the task and with a 0 otherwise. To obtain the effectiveness value per user, the completed tasks were counted, and the value obtained was divided by the total number of attempted tasks. Only five users (50%) completed the four tasks in Instrument II, while the remaining five users had problems completing at least one task (50%).

Table 10. Task completion results for users with a student profile

Task Task completion

U1 U2 U3 U4 U5 U6 U7 U8 U9 U10

Set learning style 1 1 1 1 1 1 1 1 1 1

Consult MLO 1 1 1 1 1 0 0 0 0 1

Interact with the MLO 1 1 0 1 1 1 0 0 0 1


Support 1 1 1 1 1 1 1 1 1 1

Effectiveness value 1.00 1.00 0.75 1.00 1.00 0.75 0.50 0.50 0.50 1.00

Considering the task completion values in Table 10 and substituting these values in the effectiveness formula, we have:

Effectiveness = ((1.00 + 1.00 + 0.75 + 1.00 + 1.00 + 0.75 + 0.50 + 0.50 + 0.50 + 1.00) / 10) × 100 = (8.00 / 10) × 100 = 80.00%.

The result obtained indicates that users with a student profile were able to complete the tasks in the mobile application with an effectiveness of 80.00%, which is considered satisfactory according to the acceptability ranges for effectiveness shown in [33].

4.2.2 Efficiency Results

Table 11 shows the times in minutes that each user with a student profile took to complete each task described in instrument II. For the case of users who could not complete the task, the time in which they abandoned the task is indicated.

Table 11. Results of times of users with student's profile

Task Total time in minutes

U1 U2 U3 U4 U5 U6 U7 U8 U9 U10

Set learning style 12 11 12 11 8 12 18 12 13 14

Consult MLO 9 4 6 4 4 3 11 5 7 5

Interact with the MLO 12 8 12 8 4 5 13 6 9 10

Support 2 1 2 2 1 2 2 2 2 3

The efficiency value of the users for MoApp considers the task completion rate indicated in Table 10 and the completion times in Table 11. For this, the efficiency value per task was calculated; the formula used to obtain the efficiency per task is described in [33]. The efficiency values per task were then averaged. Data for the calculation are the following:

Efficiency T1: 100.00
Efficiency T2: 55.17
Efficiency T3: 54.02
Efficiency T4: 100.00

To obtain the result for efficiency, we have the following:

Efficiency = (100.00 + 55.17 + 54.02 + 100.00) / 4 = 309.19 / 4 = 77.30%.

The efficiency result obtained is 77.30%, which is considered acceptable according to the acceptability ranges for efficiency shown in [33].

4.3. Results of Instrument III

Instrument III, which is based on the QUIS 7.0 proposed in [32], allowed us to determine the degree of satisfaction of teachers and students. The teachers achieved a satisfaction score of 7.53, while the students achieved 7.37. The degree of satisfaction is rated from 0 to 9, as proposed in [32]. According to the acceptability ranges presented in [32], both scores are considered satisfactory.

The degree of satisfaction of the teachers with SiGOAM and MLOR resulted from calculating the final average of Instrument III from the perspective of the teacher users. The final average was obtained by adding the averages obtained per category and dividing the result by the total number of categories. Data for the calculation are the following:

Overall reaction to the software: 7.41
Screen: 7.75
Terminology and system information: 7.44
Learning: 7.24
System capabilities: 7.78
Usability and UI: 7.56

To obtain the result for satisfaction, we have the following:

Satisfaction = (7.41 + 7.75 + 7.44 + 7.24 + 7.78 + 7.56) / 6 = 45.18 / 6 = 7.53.

The category with the highest score was the one related to system capabilities, while the one with the lowest score was the learning category.

We calculated the final average of Instrument III to obtain the degree of satisfaction of the students with MoApp. The averages for each category were summed and divided by the total number of categories. Data for the calculation are the following:

Overall reaction to the software: 6.97
Screen: 7.35
Terminology and system information: 7.83
Learning: 6.95
System capabilities: 7.32
Usability and UI: 7.80

To obtain the result for satisfaction, we have the following:

Satisfaction = (6.97 + 7.35 + 7.83 + 6.95 + 7.32 + 7.80) / 6 = 44.22 / 6 = 7.37.

The category with the highest score is that of terminology and system information, and the one with the lowest score is the one related to learning capacity.
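The satisfaction scores are simply the mean of the six QUIS 7.0 category averages reported above; a short illustrative sketch (not the authors' script):

```python
def satisfaction(category_averages):
    """Mean of the six QUIS 7.0 category averages (0-9 scale)."""
    return round(sum(category_averages) / len(category_averages), 2)

teacher_scores = [7.41, 7.75, 7.44, 7.24, 7.78, 7.56]  # SiGOAM/MLOR, teacher profile
student_scores = [6.97, 7.35, 7.83, 6.95, 7.32, 7.80]  # MoApp, student profile
print(satisfaction(teacher_scores), satisfaction(student_scores))  # 7.53 7.37
```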

5. Conclusions and Future Work

In this work, the usability evaluation of the mobile learning platform focused on learning monitoring and customization proposed in [18] was carried out based on a laboratory study to determine and improve its quality in use. Learning monitoring and customization services were specifically evaluated. The platform was evaluated under the characteristics and metrics proposed in the ISO/IEC 25010 and ISO/IEC 25022 standards. The evaluated characteristics were effectiveness, efficiency, and satisfaction from the users' perspective (teacher and student). For the evaluation, a laboratory study was designed for each platform element: SiGOAM, MLOR, and MoApp. Nine teachers and ten students participated in the laboratory studies, and their comments and suggestions allowed us to identify usability issues that were corrected in their entirety. Based on the results obtained, it was determined that users with a teacher profile presented 87.11% effectiveness, 84.08% efficiency, and 7.53 satisfaction concerning SiGOAM and MLOR; these three results are considered satisfactory. On the other hand, users with a student profile presented 80.00% effectiveness, 77.30% efficiency, and 7.37 satisfaction regarding the mobile application (MoApp); the effectiveness and satisfaction scores are classified as satisfactory, and the efficiency as acceptable. In future work, we propose the integration of an adaptability engine [34] in terms of content, format, route, feedback, and evaluations.

References / Список литературы

[1] UNESCO, Personalized Learning, 2017. Available at: https://unesdoc.unesco.org/ark:/48223/pf0000250057, accessed: Nov. 21, 2021.

[2] Y. Hernández-Velázquez, C. Mezura-Godoy, V.Y. Rosales-Morales. M-Learning and Student-Centered Design: A Systematic Review of the Literature. Advances in Intelligent Systems and Computing, vol. 1297, 2020, pp. 349-363.

[3] M. Abech, C.A. da Costa et al. A Model for Learning Objects Adaptation in Light of Mobile and Context-Aware Computing. Personal and Ubiquitous Computing, vol. 20, no. 2, 2016, pp. 167-184.

[4] S.S. Oyelere, J. Suhonen et al. Design, development, and evaluation of a mobile learning application for computing education. Education and Information Technologies, vol. 23, no. 1, 2018, pp. 467-495.

[5] S. Benhamdi, A. Babouri, and R. Chiky. Personalized recommender system for e-Learning environment. Education and Information Technologies, vol. 22, no. 4, 2017, pp. 1455-1477.

[6] H. Xie, D. Zou et al. Personalized word learning for university students: a profile-based method for e-learning systems. Journal of Computing in Higher Education, vol. 31, no. 2, 2019, pp. 273-289.

[7] D. Cáliz, J. Gomez et al. Evaluation of a usability testing guide for mobile applications focused on people with down syndrome (USATESTDOWN). Lecture Notes in Computer Science, vol. 10069, 2016, pp. 497-502.

[8] S. S. Rani, S. Krishnanunni. Educational App for Android-Specific Users—EA-ASU. Advances in Intelligent Systems and Computing, vol. 1097, 2020, pp. 325-335.

[9] H. Imran, M. Belghis-Zadeh. PLORS: a personalized learning object recommender system. Vietnam Journal of Computer Science, vol. 3, no. 1, 2016, pp. 3-13.

[10] S. Saryar, S.V. Kolekar et al. Mobile learning recommender system based on learning styles. Advances in Intelligent Systems and Computing, vol. 900, 2019, pp. 299-312.

[11] K. Meenakshi, R. Sunder et al. An intelligent smart tutor system based on emotion analysis and recommendation engine. In Proc. of the 2017 International Conference on IoT and Application (ICIOT), 2017, pp. 1-4.

[12] L. G. Martínez, S. Marrufo et al. Using a Mobile Platform for Teaching and Learning Object Oriented Programming. IEEE Latin America Transactions, vol. 16, no. 6, 2018, pp. 1825-1830.

[13] Y. Li, L. Wang. Using iPad-based mobile learning to teach creative engineering within a problem-based learning pedagogy. Education and Information Technologies, vol. 23, no. 1, 2018, pp. 555-568.

[14] A. Sharma. A proposed e-learning system facilitating recommendation using content tagging and student learning styles. In Proc. of the 2017 5th National Conference on E-Learning & E-Learning Technologies (ELELTECH), 2017, pp. 1-6.

[15] R.A.W. Tortorella and S. Graf. Considering learning styles and context-awareness for mobile adaptive learning. Education and Information Technologies, vol. 22, no. 1, 2017, pp. 297-315.

[16] T. Jagust, I. Boticki. Mobile learning system for enabling collaborative and adaptive pedagogies with modular digital learning contents. Journal of Computers in Education, vol. 6, 2019, pp. 335-362.

[17] C. B. Yao. Constructing a User-Friendly and Smart Ubiquitous Personalized Learning Environment by Using a Context-Aware Mechanism. IEEE Transactions on Learning Technologies, vol. 10, no. 1, 2017, pp. 104-114.

[18] C. Huerta-Guerrero, E. Lopez-Dominguez et al. Kaanbal: A Mobile Learning Platform Focused on Monitoring and Customization of Learning. International Journal of Emerging Technologies in Learning, vol. 16, no. 1, 2020, pp. 18-43.

[19] ISO/IEC 25010:2011 Systems and software engineering — Systems and software Quality Requirements and Evaluation (SQuaRE) — System and software quality models, 2011.

[20] ISO/IEC 25022:2016 Systems and software engineering — Systems and software quality requirements and evaluation (SQuaRE) — Measurement of quality in use, 2016.

[21] A. A. Arain, Z. Hussain et al. Evaluating usability of M-learning application in the context of higher education institute. Lecture Notes in Computer Science, vol. 9753, 2016, pp. 259-268.

[22] B. A. Kumar, P. Mohite. Usability Study of Mobile Learning Application in Higher Education Context: An Example from Fiji National University. In Mobile Learning in Higher Education in the Asia-Pacific Region, Springer Singapore, 2017, pp. 607-622.

[23] A. Pensabe-Rodriguez, E. Lopez-Dominguez et al. Context-aware mobile learning system: Usability assessment based on a field study. Telematics and Informatics, vol. 48, issue C, 2020, article no. 101346.

[24] D. Hariyanto, M. B. Triyono, and T. Köhler. Usability evaluation of personalized adaptive e-learning system using USE questionnaire. Knowledge Management & E-Learning: An International Journal, vol. 12, no. 1, 2020, pp. 85-105.

[25] B. A. Kumar and P. Mohite. Usability guideline for mobile learning apps: An empirical study. International Journal of Mobile Learning and Organisation, vol. 10, no. 4, 2016, pp. 223-237.

[26] S. Yagmur and M. P. Çakir. Usability evaluation of a dynamic geometry software mobile interface through eye tracking. Lecture Notes in Computer Science, vol. 9753, 2016, pp. 391-402.

[27] M. Asghar, I.S. Bajwa et al. A Genetic Algorithm-Based Support Vector Machine Approach for Intelligent Usability Assessment of m-Learning Applications. Mobile Information Systems, vol. 2022, 2022, Article ID 1609757, 20 p.

[28] B. A. Kumar, P. Mohite. Usability of mobile learning applications: a systematic literature review. Journal of Computers in Education, vol. 5, no. 3, 2018, pp. 1-17.

[29] T. S. Tullis and J. N. Stetson. A Comparison of Questionnaires for Assessing Website Usability. In Proc. of the UPA 2004 Conference, 2004, pp. 1-12.

[30] J. Nielsen. 10 Usability Heuristics for User Interface Design. URL: https://www.nngroup.com/articles/ten-usability-heuristics/, accessed: Nov. 21, 2021.

[31] C. M. Alonso, D. J. Domingo, and P. Honey. Los estilos de aprendizaje, Procedimientos de diagnóstico y mejora. Séptima edición, Bilbao: Editorial Mensajero, 2007 (in Spanish).

[32] J.P. Chin, V.A. Diehl, K.L. Norman. Development of an Instrument Measuring User Satisfaction of the Human-Computer Interface. In Proc. of the SIGCHI Conference on Human Factors in Computing Systems, 1988, pp. 213-218.

[33] ISO 9241-11:2018. Ergonomics of human-system interaction — Part 11: Usability: Definitions and concepts, 2018.

[34] E.N. Chujkova, A.R. Aidinyan, O.L. Tsvetkova. Adaptation Algorithm for Application Menus. Programming and Computer Software, vol. 46, no. 6, 2020, pp. 397-405 / Е.Н. Чуйкова, А.Р. Айдинян, О.Л. Цветкова. Алгоритм адаптивного изменения меню программного приложения. Программирование, том 46, no. 6, 2020 г., стр. 30-40.

Информация об авторах / Information about authors

Herminio DEL ÁNGEL-FLORES is an associate professor in the Department of Computer Science at Laboratorio Nacional de Informática Avanzada (LANIA), in Veracruz, Mexico. He completed his MSc degree at the National Laboratory of Advanced Informatics (LANIA), Mexico, in 2021. His areas of interest include mobile learning systems and user experience design.


Эрминио ДЕЛЬ АНХЕЛЬ-ФЛОРЕС - доцент кафедры компьютерных наук. В 2021 году он получил степень магистра в Национальной лаборатории прикладной информатики (LANIA) в Мексике. В сферу его интересов входят мобильные обучающие системы и дизайн пользовательского интерфейса.

Eduardo LÓPEZ-DOMÍNGUEZ received the M.Sc. and Ph.D. degrees from the National Institute of Astrophysics, Optics and Electronics (INAOE), Puebla, Mexico. He is a researcher in the Department of Computer Science at Center for Research and Advanced Studies of the National Polytechnic Institute (CINVESTAV-IPN), CDMX, México. Ph.D. López Domínguez is a member of the Researchers National System Level 1 (SNI). His research interests include mobile distributed systems, partial order algorithms and multimedia synchronization.

Эдуардо ЛОПЕС-ДОМИНГЕС получил степени магистра и доктор философии в Национальном институте астрофизики, оптики и электроники (INAOE), Пуэбла, Мексика. Он является научным сотрудником отдела компьютерных наук. Является членом Национальной системы исследователей уровня 1 (SNI). Его исследовательские интересы включают мобильные распределенные системы, алгоритмы частичного порядка и синхронизацию мультимедиа.

Yesenia HERNANDEZ-VELAZQUEZ is a professor-researcher in the Department of Computer Science at Laboratorio Nacional de Informática Avanzada (LANIA), in Veracruz, Mexico. She completed her MSc Degree at the Benemérita Universidad Autónoma de Puebla (BUAP), Mexico in 2011. Her areas of interest include mobile learning systems and user experience design.

Есения ЭРНАНДЕС-ВЕЛАСКЕС - профессор-исследователь кафедры компьютерных наук. В 2011 году она получила степень магистра в Автономном университете Бенемерита в Пуэбле (BUAP), Мексика. Сферы ее интересов включают системы мобильного обучения и проектирование пользовательского интерфейса.

Saúl DOMÍNGUEZ-ISIDRO received the Ph.D. degree from the Universidad Veracruzana (UV), Mexico. He has been a full-time Professor-researcher with the Department of Computer Science at Laboratorio Nacional de Informática Avanzada (LANIA), in Veracruz, Mexico, since 2018. Ph.D. Domínguez-Isidro is a member of the Researchers National System Level 1 (SNI). His research interests include Combinatorial Optimization, Evolutionary Algorithms, Bio-Inspired Algorithms Optimization Algorithms, and Computational Intelligence.

Сауль ДОМИНГЕС-ИСИДРО получил докторскую степень в Университете Веракрусана (УФ), Мексика. С 2018 года он является штатным профессором-исследователем Департамента компьютерных наук. Домингес-Исидро является членом Национальной системы исследователей уровня 1 (SNI). Его исследовательские интересы включают комбинаторную оптимизацию, эволюционные алгоритмы, биоинспирированные алгоритмы, алгоритмы оптимизации и вычислительный интеллект.

María Auxilio MEDINA-NIETO is a researcher in the Postgraduate Department at the Polytechnic University of Puebla (UPPuebla), Puebla, Mexico. She completed her M.Sc. and Ph.D. Degree from the Universidad de las Americas Puebla, Mexico in 2008. She is a member of the Researchers National System (SNI) Level 1 since 2019 and she has a higher profile recognition (PRODEP). Her areas of interest include ontologies, semantic web, and open educational repositories.

Мария Ауксилио МЕДИНА-НИЭТО - научный сотрудник отдела последипломного образования. Она получила степени магистра наук и доктора философии в университете лас-Америкас, Пуэбла, Мексика. Она является членом Национальной системы исследователей (SNI) уровня 1 с 2019 года. В сферу ее интересов входят онтологии, семантическая сеть и открытые образовательные репозитории.

Jorge DE LA CALLEJA received the M.Sc. and Ph.D. degrees from the National Institute of Astrophysics, Optics and Electronics (INAOE), Mexico. He has been a full-time Professor with the Computer Science Department, Polytechnic University of Puebla (UPPuebla), Mexico, since 2008. Ph.D. De la Calleja is a member of the Researchers National System Level 1 (SNI), his research interests include machine learning, computer vision, and data mining with applications in medicine, education, and astronomy.

Хорхе ДЕ ЛА КАЛЬЕХА получил степени магистра и доктора философии в Национальном институте астрофизики, оптики и электроники (INAOE), Мексика. С 2008 года он является штатным профессором факультета компьютерных наук Политехнического университета Пуэблы. Де ла Каллеха является членом Национальной системы исследователей уровня 1 (SNI). Его исследовательские интересы включают машинное обучение, компьютерное зрение и интеллектуальный анализ данных с приложениями в медицине, образовании и астрономии.
