

Titova, S., & Samoylenko, O. (2017). An Enquiry-Based Approach to Develop Language Skills in Mobile-Supported Classrooms. Journal of Language and Education, 3(3), 39-49. doi:10.17323/2411-7390-2017-3-3-39-49

An Enquiry-Based Approach to Develop Language Skills in Mobile-Supported Classrooms

Svetlana Titova

Lomonosov Moscow State University

Correspondence concerning this article should be addressed to Svetlana Titova, Faculty of Foreign Languages and Area Studies, Lomonosov Moscow State University, Lomonosovsky Prospekt, 31/1, Moscow, Russian Federation, 119192. E-mail: [email protected]

Olga Samoylenko

Far Eastern Federal University

Correspondence concerning this article should be addressed to Olga Samoylenko, Department of Education in Romance and Germanic Languages, Far Eastern Federal University, Nekrasova, 35, Ussuriysk, Primorsky krai, Russian Federation, 692508. E-mail: [email protected]

This article investigates the pedagogical impact of both the mobile-testing system PeLe and an enquiry-based approach to language skills development in the context of mobile-assisted language learning. The study aims to work out a methodological framework for PeLe implementation into a language classroom through immediate feedback and formative assessment. The framework was developed and pilot tested in a joint research project, MobiLL, by EFL teachers at Lomonosov Moscow State University (Russia) and University College HiST (Norway). The analysis based on quantitative research data demonstrated that PeLe-supported language classes resulted in language skill gains. The qualitative data analysis highlighted the positive effect of mobile formative assessment and of post-test activities on learner motivation and collaboration skills. This study suggests that the use of technology was effective in engaging students in enquiry-based tasks to cultivate collaboration.

Keywords: MALL, mobile apps, formative assessment, enquiry-based learning, immediate feedback

Recent research has demonstrated the variety of educational benefits mobile technologies have for foreign language learning/teaching, including: an increase in learner autonomy (Murphy, Bollen, & Langdon, 2012); mobile networking collaboration (Lan, Sung, & Chang, 2007; Pemberton, Winter, & Fallahkhair, 2010); new formats for problem-solving, interactive tasks based on augmented reality (Cook, 2010; Driver, 2012); a more personalized learning experience (Petersen & Markiewicz, 2008; Oberg & Daniels, 2013); instant feedback (Voelkel & Bennett, 2013); immediate diagnosis of learning problems and design of new assessment models (Cooney & Keogh, 2007).

Today different types of mobile tools are used in language learning/teaching to provide assessment and feedback, such as mobile testing or assessment systems, discipline-based mobile apps, and mobile clickers designed specifically to assess the level of student proficiency. However, in spite of the plethora of research in the area of mobile learning, instructors are challenged to examine how the pedagogical potential provided by mobile technologies relates to their teaching aims, methods, and subject matter. To date, there is no consistent mobile-assisted language learning (MALL) methodology, thus there is a great need for a new educational framework for mobile-testing apps implementation aimed at developing learner skills rather than just assessing learner knowledge.

This article is published under the Creative Commons Attribution 4.0 International License.

Materials and Methods

This study examines the extent to which the implementation of the mobile-testing system PeLe in the language classroom can be efficient in developing learner language skills.

The research question included a number of sub-questions:

• What is the PeLe pedagogical potential to develop student language skills?

• Does PeLe intervention impact assessment patterns of the traditional classroom and foster the development of learner language skills?

• How can enquiry-based methods be effectively implemented into a mobile-assisted language learning model (MALLM)?

• Does the proposed MALLM impact student motivation and to what extent?

The hypothesis of this research is that enquiry-based learning and educational opportunities provided by handheld devices, such as interactivity and immediate feedback, could foster learner language skills.

The key objective of this research is to work out sound pedagogical strategies on how to implement the mobile-testing system PeLe in the traditional language classroom.

Literature Review

Today teachers who would like to meet the expectations of a new generation of digital natives need to follow a transformational approach (Puentedura, 2011) to the development of language skills based on creative use of mobile technologies within a learner-oriented environment. The main prerequisite for this environment to function is a collaborative peer-learning approach. In this social framework, learners' expertise and cultural practices gain importance as the role of the devices becomes less important. Mobility is no longer defined through the devices but through "the learners' abilities to act flexibly in ever changing and self-constructed learning contexts" (Seipold, 2011, p. 32). Seipold (2011) argues that only if teachers provide spaces for learners to act according to their interests, agency and cultural practices, can innovative uses for the devices be discovered by learners.

The research framework is also based on Mishra and Koehler's (2006) model for implementing new technologies into teaching - Technological Pedagogical Content Knowledge (TPCK). This approach suggests that teachers should aim to reach a point where their traditional content and pedagogical knowledge is enhanced by technological knowledge. According to the TPCK framework, a new tool complements teachers' knowledge and skills.

This theoretical perspective suggests that learning is affected and modified by the tools employed and that, reciprocally, these tools are adjusted through the ways they are used for learning. As Stockwell and Hubbard (2013) argue: "Let the language learning task fit the technology and environment, and let the technology and environment fit the task" (p. 9). The Substitution Augmentation Modification Redefinition (SAMR) model developed by Puentedura (2011) can be used as a complement to TPCK. According to this model, the use of new technology tools in education may lead either to the enhancement of education (substitution and augmentation phases) or to genuine transformation (modification and redefinition phases). Redefinition is the highest transformation phase, which allows for a completely new format for tasks and activities that were previously impossible. This approach also offers a perspective in which pedagogical considerations shape the design of mobile learning.

Another important theory that has been influential in defining the framework of this study is enquiry-based learning. Many researchers today highlight the social aspects of mobile technologies, proposing complex structures of m-learning pedagogy built on Vygotsky's hypothesis about the importance of discussions in an educational context (Laurillard, 2007; Sharples, Taylor, & Vavoula, 2007). Enquiry-based learning is a shift away from passive methods to enquiry-based methods in which students are expected to construct their own knowledge and understanding by taking part in guided processes of enquiry (Kahn & O'Rourke, 2005).

Danaher, Gururajan, and Hafeez-Baig (2009) propose an m-learning structure based on three principles: engagement, presence and flexibility. Presence is characterized as an interaction that is sub-divided into three types: cognitive (student-content), social (peer) and teaching (student-teacher). Kearney, Schuck, Burden, and Aubusson (2012) argue that the main constituents of m-learning pedagogy are personalization, authenticity, and collaboration. Mobile technologies enable instructors to create a collaborative environment that motivates students to learn for themselves, bringing a research-based approach to the subject. These interactive, "dialogic models of learning are similar to the processes of participation in research" (Sambell, 2010, p. 56).

Ubiquitous access to information via mobile devices potentially enables a paradigmatic shift in education; it changes the way classes are managed and the instructor's role (Beatty, 2004). Kahn & O'Rourke (2005) argue that enquiry-based learning encourages students to seek out new evidence for themselves, and supports a peer learning approach. This approach implies a principal change in the paradigm of teaching due to the fact that mobile devices effectively "act as accelerators of the social discourse" (DeGani, Martin, Stead, & Wade, 2010, p. 181). Therefore, a common thread through nearly all of the literature is the importance of combining mobile voting and testing tools with such constructivist approaches as peer- and enquiry-based learning.

Mobile Testing and Voting Tools in Educational Contexts: A Brief Overview

In one study, the Language Learning and Assessment System, 'Learnosity', was used to facilitate oral evaluation of L2 Irish via mobile devices linked to an audio server. 67% of the students reported having made progress in speaking Irish as a result of this pilot project (Cooney & Keogh, 2007). 'Talkback' is an interactive response system for assessing listening and speaking skills in university L2 French and L2 English learning. It presents recorded prompts to which students respond orally. The research demonstrated that Talkback offered ease of use and immediate feedback for learners (Demouy, Eardley, Shrestha, & Kukulska-Hulme, 2011). 'UbiSysTEST' is a ubiquitous testing system that provides an opportunity for creating, storing, and correcting tests: through UbiSysTest learners can download the mobile version of the tests (Lopes & Cortes, 2007).

Quite a few mobile voting tools (Xorro-Q, Mentimeter, MbClick, Socrative, PollEverywhere, SRS, etc.) are currently available on the market. These tools have some common technological characteristics to facilitate material presentation and feedback. Many researchers argue that they allow for anonymous participation and add a game approach to the classroom environment (Martyn, 2007); they can be used successfully in a small classroom as well as in a large one (Gilbert, 2005); and they can turn multiple-choice questions into effective tools for engaging all students during class: students are more invested in participating in discussion and are more likely to have generated some ideas to share in that discussion (Bruff, 2009). Mobile voting tools can advance profound learning when teaching strategies center on higher-level thinking skills (Dangel & Wang, 2008) and help design formative assessment activities (Rubner, 2012).

Research on the pedagogical impact of the Student Response System (SRS) demonstrated that the SRS-supported approach influenced not only lecture design - time management, the mode of material presentation, activity-switch patterns - but also learner-teacher interaction, student collaboration and output, and the formats of activities and tasks. The vast majority of the research concurs that using mobile clickers in the classroom has a positive influence, especially on factors such as student motivation, engagement and peer learning potential (Caldwell, 2007).

Research Methodology

Mobile Language Learning (MobiLL) was an international project involving two institutions: Sør-Trøndelag University College (HiST, Trondheim, Norway) and Lomonosov Moscow State University (LMSU, Moscow, Russia). The project was conducted during two periods, from September 2013 to January 2014 and from February 2014 to May 2014. The key objective of this research was to work out sound pedagogical strategies on how to implement the mobile testing system PeLe into the traditional language classroom (pedagogical perspective), thus introducing some improvements to the piloted tool (technological perspective). In the first period, teachers from LMSU and HiST piloted PeLe and tried to develop the mobile assisted language learning model (MALLM) based on an enquiry-based approach and formative assessment.

The Pedagogical Potential of the Mobile Testing System PeLe to Enhance Language Learning

The mobile testing system PeLe for handheld devices, developed at Sør-Trøndelag University College, enables instructors to deliver a test through any mobile device, assess it, and provide timely feedback to both individual students and a group of students immediately after a test. Using PeLe, students can respond to tests electronically and the teacher can see on the screen what is happening during the test. Both the number of students that have answered and the percentage of correct answers to each question are continuously updated on the instructor's display. Feedback is created for each possible answer option. PeLe allows students to see on their own devices the individual score they got on the assessment. PeLe has a built-in SRS that can be used to give students a second chance in voting. The technological characteristics and the pedagogical potential of PeLe are summed up in Table 1.

Table 1

Technological characteristics and the pedagogical potential of PeLe

| Technological characteristics of PeLe | Pedagogical potential |
| Multiple-choice tests can be delivered either via PeLe or in written (paper) form | teaching in technologically limited environments; no need for profound tech preparation |
| Instant visualization of the test results | group dynamics evaluation; formative assessment approach; enhanced learner motivation |
| Immediate test assessment and feedback | timely diagnosis of teaching/learning problems; instant feedback on learning problems in a large auditorium; revision of teaching strategies to help students overcome obstacles |
| Student Response System (mobile voting tool) is installed | can be used to conduct in-class surveys, replacing paper-based and online versions; encourages peer discussions and post-test activities; can be used to run voting sessions several times; maintains students' attention longer |
| Database stores both group and individual results for each session held | e-portfolio approach: evaluation of student learning progress; formative assessment approach; easy to identify the performance of each student; evaluation of group dynamics; individual approach to each student; e-portfolio approach: student reflection on the learning progress |
| Multiple-choice questions can be created as a mixture of text and images | visualization of learning materials; enhanced learner motivation |
| The teacher can see on their screen what is happening during the test | evaluation of student learning progress; any aspect of student output is under control and can immediately be drawn attention to |
| Equipment necessary: one internet-enabled teacher computer and internet-enabled student mobile devices | teaching in technologically limited environments; no need for bulky, costly equipment |
| Use of students' own devices | no need for tech instructions - familiar devices |

PeLe is perfectly suited to evaluating both group dynamics and individual student results; it was primarily used in our research for formative or low-stakes assessment, which serves to give learners feedback on their performance and "provides them with a gauge of how close they are to reaching a pre-specified learning goal" (Sambell, 2010, p. 58). Implementation of PeLe, as well as of SRS, allows for significant changes in feedback patterns because the test results can be provided almost instantly (Arnesen, Korpas, Hennissen, & Stav, 2013).

The methodological framework of the MALL model based on PeLe implementation includes both enquiry-based methods, such as collaborative and peer-learning post-test activities, brainstorming, problem-solving activities, and group discussions, and mobile learning opportunities such as immediate feedback, formative assessment, interactivity, and flexibility. The project implementation and research design are illustrated in Figure 1.

Figure 1. The mobile assisted language learning model.

Data Collection

Data collection was carried out in three cycles that took place from September 2013 to January 2014:

(1) The intervention of PeLe tests as formative assessment tools from September to December 2013 in three experimental groups. Quantitative data from the first and the second voting of the PeLe tests were analyzed by mean, standard deviation, and Student's t-test. A grid on post-test activities was completed by the teachers of the experimental groups after each test.

(2) In addition, quantitative data from the final tests were gathered in the control and experimental groups to compare overall performance at the end of the semester; the data were analyzed by mean, standard deviation, and Student's t-test.

(3) The post-intervention questionnaire asked students to reflect on their attitudes to PeLe integration. Qualitative data were then gathered to help explain quantitative findings.

The Project Implementation and Research Design: Methodology of PeLe Intervention

We employed the following procedure of PeLe intervention in this study:

1. Setting up the assessment template

The teacher sets up the PeLe assessment template, choosing the number of questions, the answer alternatives, and the correct answer. In this case, teachers have three options: they can use the printed version of the test, they can show the test on the interactive whiteboard, or they can copy, paste and save the test in PeLe.

2. The test is on

Students take the multiple-choice test and the teacher monitors the incoming answers on the computer screen. While the students are picking answers on their hand-held devices, the teacher can see the group dynamics and each student's test results. At this stage, the teacher can see what kind of problems students have, both as a group and individually, and can detect which test questions are difficult, which better enables them to choose the most suitable follow-up activity.

3. Test submission and instant feedback

The teacher takes some time (usually about 3-4 minutes for a 20-question test) to analyze the test results; then students are shown their group scores on each test item in the form of a diagram. The teacher chooses the items that were the most difficult for students. After that, the teacher selects scaffolding from the range of post-test activities: a teacher explanation, a group discussion or brainstorming, a pair discussion, or an SRS-supported activity.
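To make the item-analysis step concrete, here is a minimal sketch (not PeLe's actual API, which is not documented here) of how per-item difficulty could be computed and the hardest items ranked; the response data below are hypothetical.

```python
# Illustrative sketch only: PeLe's internal data model is assumed, not documented.
# Per-item responses are represented as lists of True/False (correct/incorrect).
responses = {
    "item 1": [True, True, False, True, True],    # hypothetical answers from 5 students
    "item 2": [False, False, True, False, False],
    "item 3": [True, False, True, True, False],
}

def percent_correct(answers: list[bool]) -> float:
    """Share of correct answers for one test item, in percent."""
    return 100 * sum(answers) / len(answers)

# Rank items from most to least difficult (lowest percentage correct first),
# mirroring how the teacher picks the hardest items for post-test discussion.
for item in sorted(responses, key=lambda i: percent_correct(responses[i])):
    print(f"{item}: {percent_correct(responses[item]):.0f}% correct")
```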

4. Post-test activities

At this stage, the teacher has to pick a post-test activity aimed at improving the test results. The type of activity depends upon the group results on a test question. We offered the following activities: teacher explanation, group discussion or brainstorming, and pair discussion. To figure out what types of post-test activities were the most effective, we asked teachers to fill in grids on the post-test activities used after each test.

5. Second voting on the tough questions

After the post-test activities, students vote on the discussed questions using either the built-in SRS or PeLe. At this stage, the teacher can also provide immediate feedback by comparing and demonstrating the group results for the first and second trial of the test. The methodology of PeLe implementation is illustrated in Figure 2.

Participants and Methods

The project target groups consisted of Norwegian and Russian learners of English, all at approximately the same language level (B1) according to the Common European Framework of Reference for Languages. In the first period of experimentation, several groups of learners at HiST and at LMSU participated. For the purposes of this paper, only the data collected from the LMSU groups are discussed and analyzed. Students enrolled in a preparatory face-to-face English course at LMSU were randomly assigned to three experimental groups and one control group. Students of the experimental groups took a series of PeLe-supported formative grammar and vocabulary tests as volunteers using handheld devices. The control group was tested by means of a pen and paper method (see Table 2).

Figure 2. The methodology of PeLe implementation (Cycle 1): setting up the PeLe assessment template; the test is on; results analysis and instant feedback; post-test activities (grids (24) on the post-test activities used after each test); second SRS or PeLe voting on the tough questions (quantitative data on the results of the first and second trials of the tough questions are collected).

Table 2

The project target groups

| Group | Group size | Male | Female | Number of formative tests | Entrance test MS | Entrance test SD |
| Experimental group 1 | 12 | 1 | 10 | 7 | 81 | 5.97 |
| Experimental group 2 | 8 | 1 | 7 | 7 | 64 | 8.75 |
| Experimental group 3 | 9 | 3 | 6 | 7 | 66 | 15.09 |
| Control group | 7 | 0 | 7 | 0 | 60 | 7.09 |

Results and Discussion

Cycle 1

In the first cycle, the students of the three experimental groups took PeLe multiple-choice grammar and vocabulary tests according to the methodology of PeLe intervention (see Figure 2 above). In the experimental groups, formative PeLe tests were provided in the form of in-class grammar and vocabulary tests. The test data were stored on the server of HiST, Norway. The tests included from 10 to 25 items. Students responded with their smartphones or tablets. They had access to the PeLe tests by using the Wi-Fi in class. The students of the control group were taught in a more traditional way: they took the placement and final tests for summative assessment; they were not supposed to take formative PeLe-supported tests with immediate feedback on their results.

According to the methodology of PeLe intervention, we collected quantitative data on the results of the first voting (FV) and the second voting (SV) in the experimental groups (EXG) and analyzed them. To assess the magnitude of any significant changes following the intervention, effect sizes were calculated using standard statistical methods; the effectiveness of the experiment is characterized by the standard deviation (SD) and mean score (MS) for each group (see Table 3).

Table 3

The first and the second voting test results of the experimental groups

| Test | EXG 1 MS | EXG 1 SD | EXG 2 MS | EXG 2 SD | EXG 3 MS | EXG 3 SD |
| FV Test 1 | 84 | 4.03 | 50 | 5.67 | 45 | 6.77 |
| SV Test 1 | 86 | 5.15 | 70 | 8.12 | 72 | 13.56 |
| FV Test 2 | 86 | 5.61 | 40 | 6.56 | 65 | 8.67 |
| SV Test 2 | 90 | 4.12 | 75 | 11.16 | 73 | 14.96 |
| FV Test 3 | 88 | 6.45 | 68 | 7.89 | 78 | 6.67 |
| SV Test 3 | 90 | 5.99 | 70 | 13.35 | 96 | 5.36 |
| FV Test 4 | 90 | 4.89 | 40 | 10.11 | 56 | 11.14 |
| SV Test 4 | 90 | 5.37 | 85 | 17.10 | 75 | 13.13 |
| FV Test 5 | 90 | 4.04 | 62 | 13.02 | 70 | 12.56 |
| SV Test 5 | 95 | 4.83 | 90 | 11.80 | 78 | 15.03 |
| FV Test 6 | 90 | 4.56 | 65 | 5.56 | 45 | 12.44 |
| SV Test 6 | 90 | 10.33 | 80 | 13.66 | 74 | 14.08 |
| FV Test 7 | 68 | 11.13 | 71 | 6.78 | 40 | 12.55 |
| SV Test 7 | 70 | 9.66 | 60 | 13.66 | 74 | 10.07 |

Based on these calculations, we can conclude that the effectiveness of the formative tests increased after the post-test activities. These results indicate that there was a substantial improvement in group 2 and group 3 (see Table 4), where the entrance test mean scores were 64 and 66 respectively, whereas there was not such a substantial improvement in group 1, where the entrance test mean score was much higher at 81. In other words, the PeLe-supported approach proved to be more beneficial for the groups with a lower language level: FV MS2 = 56 < SV MS2 = 75, FV MS3 = 57 < SV MS3 = 77. Statistical differences between the first and second voting of the PeLe-supported test in the experimental groups were assessed using Student's t-test for independent samples, as appropriate. The t-test results of the first and second voting in group 2 and group 3 are statistically significant: EXG2 t = 3.0512, EXG3 t = 3.2342. For group 1, t = 0.5023; according to conventional criteria, this difference is considered not statistically significant.

Table 4

The mean score, standard deviation, t-test of the first and second voting of the PeLe supported test in the experimental groups

| Group | FV MS | SV MS | FV SD | SV SD | FV and SV t-test |
| EXG1 | 85 | 87 | 7.9 | 8.05 | 0.5023 |
| EXG2 | 56 | 75 | 13.11 | 10.17 | 3.0512 |
| EXG3 | 57 | 77 | 14.44 | 8.40 | 3.2342 |
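For readers who want to reproduce this kind of analysis, the sketch below shows how mean score, standard deviation, and an independent-samples Student's t-test can be computed with standard Python tools (statistics and scipy). The score lists are hypothetical placeholders, not the project data reported in Tables 3 and 4.

```python
from statistics import mean, stdev

from scipy import stats

# Hypothetical per-student percentage scores for one group on one test
# (placeholders for illustration; the project data are only available as
# group-level MS/SD in Tables 3 and 4).
first_voting = [55, 60, 45, 50, 65, 40, 58, 52]
second_voting = [70, 75, 68, 72, 80, 66, 74, 71]

print("FV: MS =", round(mean(first_voting), 1), "SD =", round(stdev(first_voting), 2))
print("SV: MS =", round(mean(second_voting), 1), "SD =", round(stdev(second_voting), 2))

# Independent-samples Student's t-test comparing first and second voting,
# analogous to the FV vs. SV comparison reported in Table 4.
t_statistic, p_value = stats.ttest_ind(first_voting, second_voting)
print("t =", round(t_statistic, 4), "p =", round(p_value, 4))
```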

As indicated above, our study demonstrated that the second voting results in the experimental groups were better than the test results of the first trial. The main reasons for this could be the immediate feedback on group test results and the post-test activities offered by the teachers in the experimental groups. Hattie and Timperley (2007) emphasize that effective feedback needs to provide information that specifically relates to the task so that students can develop error detection strategies and use the feedback to tackle more challenging tasks; this insight proves true in our study.

As for the types of post-test activities that were the most effective, the teachers filled in the grids on the post-test activities used after each test. To answer the third question of our study, we first tried to figure out what kind of post-test activity led to an improvement in student performance, and which could therefore be deemed the most efficient; second, whether any correlation existed between the group item score and the type of activity offered by the instructor. Our grid data analysis demonstrated that, on average, 5.125 activities were offered in each group after each test. The most frequently used activity was the class discussion (48%), followed by teacher explanation (32.5%) and then group discussion activities (19.5%), the least frequently used of all (see Table 5).

Class discussions are likely used more frequently because it is easier for instructors to act as class facilitators and to initiate discussions by asking open-ended questions that provoke further discussion, stimulate deeper exploration, and challenge student thinking, encouraging them to seek new ways to work with problems and situations. But in this case, only carefully formulated questions can stimulate the generation of ideas and interest in what students have to say, providing clues as to whether students are "on track" (Kahn & O'Rourke, 2005). A peer tuition approach seems to be very time- and effort-consuming on the part of the teacher; furthermore, the experimental language groups were not so numerous, so it was quite comfortable for the learners and the instructor to have a discussion.

Table 5

Number and type of post-test activities

| Group | Teacher explanation, % | N | Class discussion, % | N | Group discussion, % | N | % of total activities | Total N |
| EXG1 | 34.4 | 11 | 53.1 | 17 | 12.5 | 4 | 26 | 32 |
| EXG2 | 37.5 | 18 | 43.8 | 21 | 19.9 | 9 | 39 | 48 |
| EXG3 | 25.6 | 11 | 48.8 | 21 | 25.6 | 11 | 35 | 43 |
| All groups | 32.5 | 40 | 48 | 59 | 19.5 | 24 | | 123 |

The average number of activities per group after each test: 5.125.

Our data analysis demonstrated that the class discussion was the most efficient post-test activity because the increase in the second trial test results was significant. These increases were statistically more significant for group 2 (t = 3.0512, FV MS2 = 56 < SV MS2 = 75) and for group 3 (t = 3.2342, FV MS3 = 57 < SV MS3 = 77). In each of these groups, 21 class discussion activities were used (see Table 6). These results substantiate the value of group discussions for student understanding, confirming the significance of a constructivist approach in mobile learning (Arnesen et al., 2013).

Table 6

Correlation between second voting test results and the post-test activities offered

| Group | SV MS | FV and SV t-test | Teacher explanation (N) | Class discussion (N) | Group discussion (N) |
| EXG1 | 87 | 0.5023 | 11 | 17 | 4 |
| EXG2 | 75 | 3.0512 | 18 | 21 | 9 |
| EXG3 | 77 | 3.2342 | 11 | 21 | 11 |

The next question we tried to answer was the correlation between the group item score and the type of activity offered. In the methodology of PeLe intervention, we presumed that if the results were scattered all over the place and about 50% or more of the students gave incorrect answers, it meant that the class did not have a clue, so a teacher explanation should follow. If more than 50% of the students answered correctly, the teacher could initiate either a group discussion or a pair discussion aimed at figuring out the correct answer to the test item. A pair discussion can be based on a peer-tuition approach wherein the student who answered correctly explains the answer to the other student.

Our grid data analysis demonstrated that if 50% of the group members gave incorrect item answers, the instructors preferred the teacher explanation technique; if from 50 to 75% of the group members gave the correct item answer, the instructors offered either pair or class discussion activities. In this case, the teacher's choice depends on two important considerations: whether it is necessary to give some additional input to initiate the discussion, and whether the chosen activity helps save the time assigned for the test and its discussion.
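The rule of thumb described above can be summarised in a short sketch; the function name is ours, and the behaviour for items that more than 75% of the group answered correctly is an assumption, since the article does not specify it.

```python
def choose_post_test_activity(percent_correct: float) -> str:
    """Suggest a post-test activity for a single test item, following the
    rule of thumb described in the article (thresholds in percent correct)."""
    if percent_correct < 50:
        # Results are scattered and half the class or more answered incorrectly.
        return "teacher explanation"
    elif percent_correct <= 75:
        # A majority answered correctly: peers can explain to each other.
        return "pair or class discussion"
    else:
        # Not specified in the article; we assume no dedicated activity is needed.
        return "no post-test activity (assumption)"

# Hypothetical item scores, not project data.
for item, pct in {"item 2": 35.0, "item 5": 60.0, "item 9": 90.0}.items():
    print(item, "->", choose_post_test_activity(pct))
```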

Cycle 2

The learners of both the control and experimental groups were given the same placement and final tests. These tests were used for summative assessment. The overall mean scores were used to compare the overall performance of the control and experimental groups after the implementation of the intervention. The data collected on the mean scores of the entrance and final tests in the control and experimental groups suggest that the introduction of PeLe tests helped improve academic performance in the experimental groups in terms of the mean results of the final test (see Table 7), whereas the control group demonstrated just a slight increase in mean scores (from 60 to 62). Statistical differences between the two tests were also assessed using Student's t-test for independent samples, as appropriate: the t-test result in group 1 is 1.8027, in group 2 - 2.6201, in group 3 - 1.2405. In the control group, the t-test result is the lowest at 0.7025. These data suggest that the introduction of the PeLe-supported approach helped improve student performance in all three experimental groups.

Table 7

Final and entrance test results in control and experimental groups

| Group | Entrance test MS | Entrance test SD | Final test MS | Final test SD | t-test |
| EXG1 | 81 | 5.97 | 87 | 8.23 | 1.8027 |
| EXG2 | 64 | 8.75 | 78 | 14.58 | 2.6201 |
| EXG3 | 66 | 15.09 | 73 | 10.30 | 1.2405 |
| CG | 60 | 7.09 | 62 | 4.88 | 0.7025 |

These results support our hypothesis that collaborative enquiry-based learning and the educational opportunities provided by handheld devices for formative assessment led to significantly better exam performance for the students who took PeLe quizzes, compared to those who did not. The summative part of the PeLe tests ensured a high completion rate, whereas the formative part provided students with prompt feedback and gave them information on what they needed to do to improve their performance (Voelkel & Bennett, 2013). The increase in the overall exam results was encouraging, but not conclusive enough to show that only the PeLe tests were beneficial.


Table 8

Results of the post-study questionnaire

| Statement | Strongly agree | Agree | Disagree | Strongly disagree | Mean score |
| 1. Mobile devices are the best tools to be used for language practice/testing. | 17 | 4 | 3 | 0 | 3.58 |
| 2. PeLe tests helped me understand the topic in focus. | 16 | 6 | 2 | 0 | 3.58 |
| 3. PeLe tests helped me get ready for midterm and finals a lot. | 16 | 4 | 2 | 2 | 3.41 |
| 4. Instant feedback after PeLe tests was very supportive and encouraging for my learning. | 22 | 2 | 0 | 0 | 3.91 |
| 5. Post-test activities (class and group discussions) made me better understand grammar rules. | 14 | 6 | 4 | 0 | 3.41 |
| 6. Post-test activities were very helpful and timely. | 17 | 5 | 2 | 0 | 3.62 |
| 7. Re-voting on SRS after post-test activities helped me correct my mistakes and reflect on my grammar practice. | 11 | 6 | 3 | 4 | 3.00 |
| 8. Activity switching kept me engaged in class. | 10 | 8 | 4 | 2 | 3.08 |
| 9. The use of mobile devices and tasks based on PeLe was fun and changed my attitude to learning. | 10 | 8 | 4 | 2 | 3.08 |
| 10. PeLe tests were frustrating; they complicated my learning a lot. | 0 | 0 | 4 | 20 | 1.16 |
| 11. PeLe based tests were motivating and challenging. | 16 | 6 | 2 | 0 | 3.58 |

Cycle 3

During this cycle, an online survey was administered to elicit students' own perceptions of the PeLe intervention. The quantitative data were supplemented by student feedback gained from a post-study questionnaire. The post-study questionnaire contained 11 statements aimed at gathering student views on the strengths and weaknesses of the PeLe intervention. This survey went well beyond a simple evaluation of learning experiences, since the scores were used as a proxy for understanding how exposure to and use of PeLe by the students could impact their attitude toward mobile learning. The questionnaire was completed by 24 students (22 female, 2 male) of the experimental groups. Responses are provided in Table 8.
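The mean scores in Table 8 follow from the response counts under a conventional 4-point coding (strongly agree = 4 ... strongly disagree = 1); this coding is our assumption, but it reproduces the reported values. A minimal sketch, using statement 1 as the worked example:

```python
def likert_mean(counts: dict[str, int]) -> float:
    """Mean score on a 4-point agreement scale (assumed coding:
    strongly agree = 4, agree = 3, disagree = 2, strongly disagree = 1)."""
    weights = {"strongly agree": 4, "agree": 3, "disagree": 2, "strongly disagree": 1}
    total = sum(counts.values())
    return sum(weights[k] * n for k, n in counts.items()) / total

# Statement 1 in Table 8: 17 / 4 / 3 / 0 responses from 24 students -> 3.58
statement_1 = {"strongly agree": 17, "agree": 4, "disagree": 3, "strongly disagree": 0}
print(round(likert_mean(statement_1), 2))
```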

In response to statement 1, students averaged 3.58. Overall, students expressed high levels of satisfaction with the PeLe-supported tests undertaken on mobile devices. The majority of students commented favorably on the fact that mobile tests helped them understand the topic in focus and get ready for midterms and finals, with responses to the second statement averaging 3.58 and to the third statement 3.41. The largely positive reaction to statements 4, 5 and 6 - where the mean scores were 3.91, 3.41 and 3.62 respectively - emphasizes that immediate feedback on test results was very supportive and encouraging for student learning. Students appreciated the prompt feedback they got on their own understanding of the material. They also mentioned that the group discussion time gave them a chance to learn from each other. In response to statement 7, the average was 3.00. The students were mostly positive about the fact that re-voting after post-test activities helped them correct their mistakes and promoted reflection on grammar practice.

The positive reaction to statement 8, where the mean score was 3.08, indicates that the activity-switch approach kept the students involved. Statements 9, 10 and 11 were designed to collect students' attitudes to the PeLe intervention. The average for statement 9 was 3.08, with the majority of students claiming that the use of mobile devices was fun and changed their attitude to learning. This supports the idea that the availability of mobile devices for learners makes them an attractive supplement to other forms of teaching and learning a second language (Stockwell & Hubbard, 2013).

In reaction to statement 10, the majority of students disagreed with the claim that PeLe tests were frustrating and complicated their learning a lot. The average for statement 11 was 3.58, supporting the idea that formative assessment tests promote feedback that seeks to empower students to become motivated and committed to exercising more control over their own learning (Sambell, 2010).

Our findings suggest that students place heavy emphasis on the value of instant, timely feedback on their tests as well as on post-test activities that stood them in good stead in improving their grammar skills. However, despite the overall positive responses, there was also some notable ambivalence: several respondents combined positive comments with criticisms. For example, in answering statement 7, four students strongly disagreed with the idea that re-voting on tough test items after post-test activities helped them correct their mistakes and promote reflection on their grammar practice. Nonetheless, our analysis clearly demonstrates that the vast majority of students found PeLe implementation to be very appealing.

Conclusion

This study indicates that the integration of mobile apps into language learning could be efficient, especially if combined with enquiry-based and peer-learning approaches along with the pedagogical potential provided by mobile testing systems. The experimental results suggest that the MALLM approach, combining m-learning, enquiry-based learning theory and formative assessment, is the most effective. The research results support our hypothesis that collaborative enquiry-based learning and the educational opportunities provided by handheld devices for formative assessment lead to significantly better exam performance for students who participate in formative PeLe quizzes, compared to those who do not.

Since our conclusions are based on subjective interpretation (surveys) as well as objective data (server logs and quiz results), this research has some limitations. Firstly, the number of participants was small; consequently, their reflections may not be equally applicable to all mobile learners' perceptions. Secondly, PeLe enables the instructor to diagnose not only group performance but also the results of each student, because each individual's responses are also stored, enabling an individual student's performance to be tracked across multiple sessions. One of the advantages of mobile and computer testing is that the test can be individually administered and tailored to each student's ability level. Although some students in our research appreciated the immediate feedback, some said that they would like to get more personalized feedback.

We hope that our research will provide some constructs for pedagogical thinking about enhancing MALL with a new mobile-assisted assessment methodology. Formative assessment practices (Black & Wiliam, 2009) lie at the heart of the project's innovative pedagogic approach, offering a practical way of embodying assessment for language learning environments. Although the mobile testing system PeLe holds promise, more research is needed to determine its effects upon developing not only grammar skills but also other skills such as speaking, writing, and listening.

Acknowledgements

We would like to express our sincere gratitude to the MobiLL project team, our Norwegian colleagues, for making this research happen - Professor John Stav and Ekaterina Zourou for initiating the project and sharing ideas on mobile tools integration into language classroom; Arild Smolan and Ketil Arnesen for administrative and financial support; all the teachers who participated in the project - Anna Avramenko, Anna Lyubomskaya, Even Einum - for providing feedback on PeLe implementation.

References

Arnesen, K., Korpas, G. S., Hennissen, J., & Stav, J. B. (2013). Experiences with use of various pedagogical methods utilizing a student response system - motivation and learning outcome. The Electronic Journal of E-Learning, 11(3), 169-181.

Beatty, I. (2004). Transforming student learning with classroom communication systems. EDUCAUSE Research Bulletin, 3. Boulder, CO: ECAR. Retrieved from https://library.educause.edu/resources/2004/2/transforming-student-learning-with-classroom-communication-systems

Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21, 5-31.

Bruff, D. (2009). Multiple-choice questions you wouldn't put on a test: Promoting deep learning using clickers. Essays on Teaching Excellence, 21(3), 25-34. Retrieved from http://podnetwork.org/content/uploads/V21-N3-Bruff.pdf

Caldwell, J. (2007). Clickers in the large classroom: Current research and best-practice tips. Life Sciences Education, 6(1), 9-20. doi:10.1187/cbe.06-12-0205

Cook, J. (2010). Mobile phones as mediating tools within augmented contexts for development. International Journal of Mobile and Blended Learning, 2(3), 1-12. doi:10.4018/jmbl.2010070101

Cooney, G., & Keogh, K. A. (2007). Use of mobile phones for language learning and assessment for learning, a pilot project. In Proceedings of the 6th Annual International Conference on Mobile Learning (pp. 46-50). Melbourne, Australia: VIC. Retrieved from http://iamlearn.org/mlearn-archive/mlearn2007/files/mLearn_2007_Conference_Proceedings.pdf

Danaher, P. A., Gururajan, R., & Hafeez-Baig, A. (2009). Transforming the practice of mobile learning: Promoting pedagogical innovation through educational principles and strategies that work. In H. Ryu & D. Parsons (Eds.), Innovative mobile learning: Techniques and technologies (pp. 21-46). Hershey, PA: IGI Global. doi:10.4018/978-1-60566-062-2.ch002

Dangel, H. L., & Wang, C. X. (2008). Student response systems in higher education: Moving beyond linear teaching and surface learning. Journal of Educational Technology Development and Exchange, 1(1), 93-104.

DeGani, A., Martin, G., Stead, G., & Wade, F. (2010). E-learning standards for an m-learning world - informing the development of e-learning standards for the mobile web. Research in Learning Technologies, 25(3), 181-185. Retrieved from http://www.researchinlearningtechnology.net/index.php/rlt/article/view/19153

Demouy, V., Eardley, A., Shrestha, P., & Kukulska-Hulme, A. (2011). The interactive oral assessment (IOA) project: Using Talkback® for practice and assessment of listening and speaking skills in languages. In ICL 2011 Interactive Collaborative Learning (pp. 126-129). Piešťany, Slovakia. Retrieved from http://oro.open.ac.uk

Driver, P. (2012). Pervasive games and mobile technologies for embodied language learning. International Journal of Computer-Assisted Language Learning and Teaching, 2(4), 23-37.

Gilbert, A. (2005). New for back-to-school: Clickers. CNET News. Retrieved from http://news.cnet.com/New-for-back-to-school-Clickers/2100-1041_3-5819171.html

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112.

Kahn, P., & O'Rourke, K. (2004). Guide to curriculum design: Enquiry-based learning. Retrieved from http://www.campus.manchester.ac.uk/ceebl/resources/guides/kahn_2004.pdf

Kearney, M., Schuck, S., Burden, K., & Aubusson, P. (2012). Viewing mobile learning from a pedagogical perspective. Research in Learning Technology Journal, 20(1), 21-34.

Lan, Y-J., Sung, Y-T., & Chang, K-E. (2007). A mobile-device-supported peer-assisted learning system for collaborative early EFL reading. Language Learning & Technology, 11(3), 130-151. Retrieved from http://llt.msu.edu/vol11num3/pdf/lansungchang.pdf

Laurillard, D. (2007). Pedagogical forms of mobile learning: Framing research questions. In N. Pachler (Ed.), Mobile learning: Towards a research agenda (pp. 153-176). London, UK: WLE Centre, Institute of Education.

Lopes, R. F., & Cortes, O. A. C. (2007). An ubiquitous testing system for m-learning environments. In Second International Conference on Systems and Networks Communications (pp. 25-31). Cap Esterel, France: ICSNC. doi:10.1109/ICSNC.2007.20

Martyn, M. (2007). Clickers in the classroom: An active learning approach. EDUCAUSE Quarterly, 2, 71-74. Retrieved from http://net.educause.edu/ir/library/pdf/EQM0729.pdf

Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108, 1017-1054.

Murphy, P., Bollen, D., & Langdon, C. (2012). Mobile technology, collaborative reading, and elaborative feedback. In J. Díaz-Vera (Ed.), Left to my own devices: Learner autonomy and mobile-assisted language learning innovation and leadership in English language teaching (pp. 131-159). Bingley, UK: Emerald Group Publishing Limited. doi:10.1163/9781780526478_008

Oberg, A., & Daniels, P. (2013). Analysis of the effect a student-centred mobile learning instructional method has on language acquisition. Computer Assisted Language Learning, 26(2), 177-196. Retrieved from http://www.tandfonline.com

Pemberton, L., Winter, M., & Fallahkhair, S. (2010). Collaborative mobile knowledge sharing for language learners. Journal of the Research Center for Educational Technology, 6(1), 144-148. Retrieved from http://www.rcetj.org

Petersen, S., & Markiewicz, J-K. (2008). PALLAS: Personalized language learning on mobile devices. In Proceedings of the 5th IEEE International Conference on Wireless, Mobile and Ubiquitous Technology in Education (pp. 52-59). Los Alamitos, CA: IEEE Computer Society. Retrieved from http://www.idi.ntnu.no

Puentedura, R. R. (2011). A brief introduction to TPCK and SAMR [Blog post]. Retrieved from http://hippasus.com/rrweblog/archives/2011/12/08/BriefIntroTPCKSAMR.pdf

Rubner, G. (2012). MbClick: An electronic voting system that returns individual feedback. Retrieved from http://www.heacademy.ac.uk/assets/documents/stem-conference/gees/Geoff_Rubner.pdf

Sambell, K. (2010). Enquiry-based learning and formative assessment environments: Student perspectives. Practitioner Research in Higher Education, 4(1), 52-61.

Seipold, J. (2011). A critical perspective on mobile learning: Results of a heuristic analysis of the scientific process and a hermeneutic analysis of mobile learning practice. In K. Rummler, J. Seipold, E. Lubcke, N. Pachler & G. Attwell (Eds.), Book of abstracts of the conference: Mobile learning: Crossing boundaries in convergent environments (pp. 31-34). Bremen, Germany. Retrieved from https://www.researchgate.net/publication/266082483_Mobile_learning_Crossing_boundaries_in_convergent_environments_%27Mobile_learning_Crossing_boundaries_in_convergent_environments%27_conference_Book_of_abstracts

Sharples, M., Taylor, J., & Vavoula, G. (2007). A theory of learning for the mobile age. In R. Andrews & C. Haythornthwaite (Eds.), The SAGE handbook of e-learning research (pp. 221-224). London, UK: Sage.

Stockwell, G., & Hubbard, P. (2013). Some emerging principles for mobile-assisted language learning. Monterey, CA: The International Research Foundation for English Language Education. Retrieved from http://www.tirfonline.org/english-in-the-workforce/mobile-assisted-language-learning

Voelkel, S., & Bennett, D. (2013). Combining the formative with the summative: The development of a two-stage online test to encourage engagement and provide personal feedback in large classes. Research in Learning Technology, 21(1), 75-92. Retrieved from http://www.researchinlearningtechnology.net/index.php/rlt/article/view/19153
