
The Likelihood of Cheating at Formative Vocabulary Tests: Before and During Online Remote Learning in English Courses

https://doi.org/10.17323/jle.2024.14037

Budi Waluyo , Nur Lailatur Rofiah

Walailak University, Thailand

ABSTRACT

Introduction: Early review studies identified the prevalence of cheating and the emergence of various forms of cheating in academic institutions. Now, there is growing concern about the rise of academic dishonesty in unproctored online test environments conducted remotely. Purpose: This study examined the likelihood of student cheating at formative vocabulary tests conducted before and during online remote learning in English courses. The vocabulary tests were administered using the Socrative application in both learning conditions. Method: Using a quantitative research design, including multiple paired-sample t-tests and independent t-tests, this study collected 2971 first- and second-year students' formative scores across six general English courses.

Results: Multiple paired-sample t-tests confirmed that students' scores were significantly higher during online remote learning, with score differences ranging from 0.10 to 2.21 between before and during online remote learning. This difference in score patterns indicated the likelihood of students cheating during online remote learning. Independent t-tests did not reveal a tendency for male students to cheat on online tests more often than female students.

Conclusion: The findings of this study may serve as an initial phase of inquiries into the identification of formative test cheating in online English classes.

KEYWORDS

cheating, online remote test, online remote learning, English course

INTRODUCTION

Citation: Waluyo, B., & Rofiah, N. L. (2024). The likelihood of cheating at formative vocabulary tests: Before and during online remote learning in English courses. Journal of Language and Education, 10(1), 133-145. https://doi.org/10.17323/jle.2024.14037

Correspondence:

Nur Lailatur Rofiah, e-mail: [email protected]

Received: March 21, 2022 Accepted: March 15, 2024 Published: March 30, 2024

Due to the emergence of COVID-19 at the beginning of 2020, a sudden shift from face-to-face to online classes revealed several issues in pedagogical practices. The growth in student cheating on online remote exams and formative tests is one of them. An early review study identified the prevalence of cheating and the emergence of various forms of cheating in academic institutions (McCabe et al., 2001), and now that higher education institutions are forced to organize online remote exams, there is growing concern about the rise of academic dishonesty in unproctored online test environments. Long before the outbreak, empirical research on student cheating predicted that, due to a lack of face-to-face contact between student and teacher, online remote cheating would be more prevalent than traditional forms of cheating (e.g., Fontaine, 2012; McNabb & Olmstead, 2009). A growing amount of recent research has attempted to collect evidence of student cheating (Bilen & Matros, 2021; Vellanki et al., 2023), develop proctoring strategies (Nguyen et al., 2020), and search for appropriate assessment designs (Raje & Stitzel, 2020) in examinations held during COVID-19 online remote classes. Meccawy et al. (2021) gathered students' and lecturers' perspectives on the implementation of online remote tests during the COVID-19 period; both students and lecturers expressed concerns about the increase in cheating and plagiarism and urged the university to raise student awareness and ethics, train lecturers to detect cheating methods, and impose severe sanctions on those who engage in such practices.

Despite the noted challenges around maintaining high standards of academic integrity in assessment during the COVID-19 period, relatively few studies specifically examine the identification of student cheating on tests in online remote English classes at the university level among Asian EFL students. The existing literature has predominantly concentrated on identifying cheating as a key area of concern. This emphasis is crucial, as the insights gained can guide the development of effective designs for online assessments, ultimately minimizing instances of cheating (Arnold, 2016). In response, the current study attempts to identify cheating at formative vocabulary tests conducted before and during COVID-19's online remote learning among university students in Thailand. The Thai government issued a national emergency decree on March 26, 2020, requiring Thai universities, including the site of this research, which was in the midst of the academic year 2019-2020, to transition from face-to-face to fully synchronous online remote learning from April 2, 2020, onward (Rofiah et al., 2022). This instruction arrived in the middle of the academic term, which meant that students and lecturers had completed half of the term in class and experienced online remote classes for the remaining weeks. This study was therefore able to collect data on students' vocabulary test results before and throughout the COVID-19 outbreak's online remote learning.

Three approaches have been used in prior studies to detect the likelihood of cheating. The first approach is to collect students' perceptions using scenarios designed to elicit students' personal perspectives on whether they would cheat on online tests (e.g., Daniels et al., 2021; Walsh et al., 2021). This method may include self-reported surveys or qualitative interviews to ascertain whether students cheated on online tests in previous terms (Janke et al., 2021). The second approach is to compare students' test scores in offline and online environments (e.g., Brallier & Palm, 2015; Chuang et al., 2017; Ranger et al., 2020). The last approach assesses the likelihood of student cheating by examining the grade patterns of students (Arnold, 2016).

The current study takes the second and third approaches, i.e., comparing students' test scores in offline and online remote contexts and observing any unusual grade patterns that may indicate the likelihood of cheating during formative vocabulary tests. The use of the internet and technology, combined with the remote distance between students and teachers, appears to have enhanced the temptation to cheat. The findings of this study test this assumption and deepen our understanding of the disparities in student test performance prior to and during COVID-19's online remote learning.
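To illustrate how the second and third approaches can be operationalized, the following minimal sketch (not the authors' analysis code; the column names, data layout, and the three-point gain cutoff are illustrative assumptions) compares each student's mean in-class and online scores and flags unusually large gains for closer inspection.

```python
# Illustrative sketch of approaches 2 and 3: compare each student's
# offline vs. online test scores and flag unusually large gains.
# Column names and the gain cutoff are assumptions, not the study's code.
import pandas as pd

def flag_unusual_gains(df: pd.DataFrame, gain_cutoff: float = 3.0) -> pd.DataFrame:
    """Expects one row per student with 'in_class_mean' and 'online_mean' (out of 15)."""
    out = df.copy()
    # Approach 2: direct comparison of offline and online performance.
    out["gain"] = out["online_mean"] - out["in_class_mean"]
    # Approach 3: grade-pattern screening; a jump above the cutoff is treated
    # as an irregular pattern worth a closer look, not as proof of cheating.
    out["flagged"] = out["gain"] > gain_cutoff
    return out

# Toy data: s4's jump from 8.0 to 14.5 would be flagged for review.
scores = pd.DataFrame({
    "student": ["s1", "s2", "s3", "s4"],
    "in_class_mean": [9.5, 11.0, 10.2, 8.0],
    "online_mean": [10.0, 11.3, 10.5, 14.5],
})
print(flag_unusual_gains(scores))
```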

This study is built upon these approaches to detecting student cheating on online remote tests. The literature review section below reviews studies on cheating on online remote tests both before and during COVID-19's online remote learning. It continues with a discussion of EFL teachers' concerns over the reliability and validity of assessments during online learning due to the high possibility of student cheating. Then, it takes up the practice of using technology-based approaches for formative assessment in online learning and the role of gender among students who cheat on tests. The following research questions are addressed using empirical data:

(1) Was there a significant difference in student performance on vocabulary tests undertaken before the commencement of COVID-19-related online learning and those undertaken during COVID-19-related online learning?

(2) How did the performance of female and male students compare on the vocabulary tests?

LITERATURE REVIEW

Cheating Practices in Higher Education

Cheating on a test is described as a violation of the regulations that have been established for a specific test and explicitly laid out for students (Dick et al., 2003). Most test rules prohibit copying classmates' answers, opening learning materials such as books and modules, seeking answers from reachable people such as classmates and teachers, and using digital device aids that can assist in finding test answers, all of which essentially require students to concentrate on answering test questions using their own knowledge without outside assistance. Cheating has long been a problem in educational assessments, as it is often perceived as a quick way to earn a decent grade (Aiken, 1991). The scale of the problem is demonstrated in an early study by McCabe et al. (2001), who reviewed the studies on cheating over the preceding 30 years and reported that students' impressions of their peers' behavior were the most powerful influence on their inclination to cheat. While it is true that not all students cheat, they are inclined to do so if they witness classmates cheating on tests. Moreover, tests that are seen as difficult learning tasks will have a substantial, direct impact on students' likelihood of cheating, as they can generate negative emotions, such as anxiety and stress, as well as increased pressure prior to the tests (Wenzel & Reinhard, 2020).

Cheating on tests becomes more of a concern in online remote learning contexts. One of the primary reasons is that proctors are unable to supervise students completely during online remote testing, which results in increased potential for students to cheat. Even though there are a number of proctoring approaches that enable relatively secure online testing environments to be established, for example ProctorU or Proctorexam.com, this type of affordance was not available to the institution in question. Fask et al. (2014), for example, studied students' test-taking behaviors in offline and online environments. Their findings revealed that the online testing environment has a detrimental effect on performance, including increased ambient distractions, differences in student comfort, differences in technical difficulties, and differences in the ability to seek clarification on potentially ambiguous exam questions. All these negative consequences encourage students to cheat, meaning that online testing aids student cheating, a conclusion reinforced by further research (e.g., Chuang et al., 2017; Ranger et al., 2020). When online remote examinations are not proctored, students are more likely to cheat (Harmon & Lambrinos, 2008). It has been recognized that students perform much better on unproctored online remote tests than on proctored classroom assessments, raising the possibility of cheating (Brallier & Palm, 2015; Waluyo & Tuan, 2021). Thus, to combat academic dishonesty in online testing, previous research has emphasized the importance of 1) tightening the proctoring process using webcam recording software, which can be useful during tests and for post-test evaluation (Dendir & Maxwell, 2020), and 2) using paraphrased test questions whose answers are not readily available on the internet (Golden & Kohlbeck, 2020). Cheating, nevertheless, may not be completely eliminated due to the nature of online remote testing. However, an empirical study by Ladyshewsky (2015) discovered no statistically significant differences between post-graduate students' scores on supervised in-class tests and unsupervised online tests, even though both types of tests included multiple-choice questions that are prone to cheating. These findings suggest that the higher the educational level at which students study, the less likely they are to cheat, regardless of the testing situation.

Online Remote Learning and Tests during COVID-19

In March 2020, many higher education institutions worldwide transitioned from face-to-face learning to online learning in response to the COVID-19 pandemic. These significant shifts occurred spontaneously and without prior planning but were critical for reducing contact between students and teachers and containing the spread of the COVID-19 virus. Since then, educators have encountered numerous barriers and challenges, raising concerns about the effectiveness of COVID-19's online remote learning as a substitute for traditional teaching and learning. One point of contention is whether the new norm of online learning makes it easier for students to cheat. As a result, a growing body of empirical research has been conducted on the subject in different countries. Janke et al. (2021) conducted a survey in Germany to determine the dangers of ad hoc online assessment for academic integrity. They surveyed 1608 German students from various higher education institutions who had participated in COVID-19's online remote learning. As expected, their investigation found students' accounts of frequent cheating on tests and exams when enrolled in online learning. Similar findings have emerged from empirical studies involving students from a variety of countries, including Bangladesh, Canada (Daniels et al., 2021), and the United States of America (Walsh et al., 2021), but little is known about Thailand. Among the key factors that contribute to students cheating on tests during COVID-19's online remote learning are stress and anxiety related to COVID-19's circumstances (Apridayani et al., 2023); negative emotions impair one's ability to focus on learning. Moreover, both university lecturers and students have acknowledged that online remote learning makes it easier for students to cheat due to the lack of supervision (Reedy et al., 2021). The findings from these latest studies on student cheating during COVID-19's online remote learning corroborate the conclusions of previous studies on online test cheating.

Concerns regarding the reliability and validity of formative and summative tests delivered during COVID-19's emergency teaching have also been voiced by EFL teachers. Ghanbari and Nowroozi's (2021) qualitative study revealed that EFL teachers saw cheating as a key problem and concentrated their efforts on reducing the likelihood of student cheating on online tests. Test results, particularly those from formative assessments, can be utilized to track student progress and serve as a benchmark for continuous improvement of student learning throughout the course. Cheating can skew test results so that they fail to reflect students' actual knowledge and skills, thereby misleading teachers in selecting subsequent teaching and learning materials. More crucially, a study by Shoaib and Zahran (2021) discovered that weak students viewed COVID-19's online remote learning as an opportunity to obtain better grades through cheating. In this case, teachers would have a difficult time identifying weak students and providing suitable interventions to aid their learning. In other instances, high performers who do not cheat on online remote examinations but receive lower results may be deemed weak and given further learning treatments. These circumstances may result in unconscious misinterpretations of students' English learning progress. Unfortunately, empirical evidence on the subject is still lacking, and student cheating on online remote assessments, particularly in current online remote learning practice, has not been thoroughly investigated in online remote English classes. Moorhouse and Kohnke (2021) reviewed articles concerning online English classes during the COVID-19 pandemic, and their findings made no mention of the ELT community identifying student cheating on online tests as an issue requiring a response. Thus, the current study intends to address this research gap.

It is critical to highlight that EFL teachers continue to undertake summative and formative assessments in their online English classes, with some adapting assessment plans to fit the online environment and others maintaining the same assessment plans as in face-to-face learning (Zhang et al., 2021; Waluyo, 2020). Of the two, formative assessment is more likely to be compromised by student cheating because of its iterative nature throughout the learning process. Compromised results will not assist teachers in identifying students' deficiencies, nor will they help students make greater overall academic progress, as Arnold (2016) suggested after examining students' scores on online formative tests at a Dutch university. That study substantiated instances of cheating in online tests by identifying irregular grade patterns that exhibited a negative correlation with students' academic progress. Throughout the pandemic era, the ELT professional community has been actively engaged in the development of process-oriented and formative assessment practices (Chung & Choi, 2021). Online formative assessments have been suggested to be critical in connecting assessment, teaching, and learning because they enable teachers to identify students' weaknesses during the learning process, provide appropriate feedback for students' learning improvement, and direct teachers' subsequent teaching approaches toward student learning enhancement (Gikandi et al., 2011). Yet, this type of assessment may be ineffective unless efforts are made to identify and resolve student cheating on online tests.

COVID-19's online learning has also been considered an opportunity to apply technology-based formative assessments (Prastikawati, 2021; Waluyo & Apridayani, 2021). One such practice is the deployment of online applications that incorporate Interactive Response Systems (IRS), which enable teachers to identify students' strengths and weaknesses in real time. Students can also observe and track their formative test outcomes. Socrative is one of several IRS-based educational apps applied in the online teaching and learning space. Students who took tests in an online class that utilized Socrative for formative assessment were pleased because the results arrived promptly and simply (Abdulla et al., 2021), and teachers maintained some continuity and active learning in the classroom despite being in a different location (Christianson, 2020). Teachers can develop multiple-choice, true/false, and short-answer questions using Socrative, and they can use a variety of delivery methods and settings when presenting the app as a formative test. Teachers can select Instant Feedback, which provides quick feedback to students once they respond to a test question; Open Navigation, which lets students answer the questions in whatever order they choose; or Teacher Pace, which allows teachers to manage the flow of questions and monitor responses as they occur. All of these activities take place in real time and are accessible through smartphones, laptops, and computers. Nonetheless, Rofiah and Waluyo's (2020) quantitative study highlighted Thai EFL students' approval of Socrative as a means of administering vocabulary formative tests, as well as the risk of student cheating during exams. Their research examined the use of Socrative for formative assessment in the classroom, and it is reasonable to assume that the possibility of cheating will be greater when the app is used in online exam environments. However, actual evidence for this is still sparse, which the current study explores.
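Since Socrative is configured through its web interface rather than through code, the short sketch below is only a schematic, hypothetical representation of the quiz settings just described (the dataclass, field names, and default values are illustrative assumptions, not a Socrative API); it records the delivery mode alongside the fixed test parameters used in these courses.

```python
# Schematic, hypothetical representation of a Socrative-style quiz setup;
# Socrative is actually configured via its web interface, not an API.
from dataclasses import dataclass
from typing import Literal

# The three delivery settings described in the text.
DeliveryMode = Literal["instant_feedback", "open_navigation", "teacher_paced"]

@dataclass
class VocabularyQuiz:
    course: str                   # e.g., "GEN61-123"
    week: int                     # week of the term the quiz is given
    delivery: DeliveryMode
    num_questions: int = 15       # fifteen multiple-choice items per test
    time_limit_minutes: int = 10  # each test lasted ten minutes
    words_to_study: int = 50      # words assigned from that week's list

quiz = VocabularyQuiz(course="GEN61-123", week=3, delivery="teacher_paced")
print(quiz)
```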

Figure 1

Steps to Launch a Quiz on Socrative.com

Meanwhile, significant gender differences are likely to be noticeable when female and male students vary in their levels of self-control, shame, perceived external sanctions, grades, and cheating intentions (Tibbetts, 1999). Given that gender serves as both a control variable (Finn & Frone, 2004) and a personal factor influencing cheating behavior (McCabe & Trevino, 1993), exploring gender differences is pivotal in understanding the motivations behind students reporting suspected academic dishonesty. In alignment with this perspective, Simon et al.'s (2004) study substantiated the relevance of gender in this context by uncovering a substantial contrast between male and female students. Their findings emphasized that female students displayed a significantly greater inclination to report suspected instances of academic dishonesty, shedding light on the intricate interplay between gender and reporting behavior in academic integrity matters. Previous studies found that male students cheat more frequently, or have a higher perception of cheating, than their female classmates (Muntada et al., 2013; Zhang et al., 2018). Gender disparities in online test cheating, on the other hand, have not been sufficiently investigated.

METHOD

Research Design

The primary objective of this study was to identify the likelihood of student cheating at formative vocabulary tests conducted before and during online remote learning. To achieve this objective, it employed a quantitative research design with an emphasis on examining disparities in student performance between in-class vocabulary tests taken before and online remote vocabulary tests taken during COVID-19's online remote learning. The vocabulary tests were administered using the Socrative application in both learning modes. This study tracked students' vocabulary test scores across six general English courses, involving students from different cohorts and academic majors, prior to and during the emergency online learning at a university in the south of Thailand.

Setting

This study was conducted in the context of six mandatory General English (GE) courses that began on February 10, 2020, and ended on May 1, 2020, during the third academic term of 2019-2020. It involved 2971 first- and second-year students from various academic majors, distributed across six different English courses. The detailed descriptions of the courses and the numbers of students involved are elaborated below and summarized in Table 1.

Course 1

The first English course was GEN61-122, entitled "Academic Listening and Speaking," and was taken by first-year students; 387 students enrolled in total. The course places an emphasis on English proficiency practice in both informal and formal settings. Through dialogues, passages, reports, and announcements, it focuses on listening and pronunciation. Moreover, through group discussions, oral presentations, and report writing, it aims to develop academic speaking skills.

Course 2

The second English course was GEN61-123, "Academic Reading and Writing," which was studied by first-year students. There were 1171 students in all. This course is primarily designed to help students improve their reading and writing skills through a variety of academic texts and exercises. It specifically strengthens students' abilities to conduct critical readings of academic publications, summarize key concepts from texts, create various types of academic reports, compose effective paragraphs and essays, and appropriately use citations and references throughout the writing process.

Course 3

The third English course, GEN61-124, was taken by second-year students and was named "English for Academic Communication." There were 156 students in all. This course aims to improve students' understanding of the English language and their ability to communicate effectively in academic and professional settings. It equips students with the necessary communication methods and abilities for academic correspondence. Moreover, it teaches students how to properly acknowledge their sources, which results in more effective academic communication.

Course 4

The fourth English course, GEN61-127, was taken by second-year students and was entitled "English for Presentation in Sciences and Technology." There were 150 students in all. This course focuses on the four key English abilities of listening, speaking, reading, and writing, with an emphasis on scientific phrases, structures, and terminology. Further, it instills in students the abilities required for effective presentation.

Course 5

The fifth English course was GEN61-128, "English for Humanities and Social Sciences Presentation," which was taken by second-year students. There was a total of 76 students. This course aims to teach students how to plan, organize, and deliver excellent presentations, focusing on content, structure, and delivery. It emphasizes several facets of oral presentations, such as pronunciation, volume, intonation, body language, gestures, and images.

Course 6

The sixth English course, GEN61-129, was taken by second-year students and was titled "English for Media and Communication." There were 1031 students in total. This course aims to help students improve their English communication abilities by utilizing a variety of artistic and communicative media. These include teleconferencing, conducting interviews, producing simple news stories, developing engaging commercials, writing scripts for blog sites, voice recording and pronunciation techniques, using a teleprompter, and speaking from a script. It builds students' confidence in their English speaking and communicative abilities.

Course Design and Data

Each of the six courses implemented weekly formative vocabulary tests over 10 weeks. In these courses, students were obliged to study fifty academic English words from provided lists each week. Following that, starting in either week 2 or week 3, students' vocabulary knowledge was assessed during the first ten minutes of class before the main lessons took place. Each test lasted ten minutes and consisted of fifteen multiple-choice questions. Students completed ten vocabulary tests using Socrative.com over the course of ten weeks, accessing the tests on their smartphones while teachers proctored them in the classroom. When the COVID-19 outbreak occurred, the students were in the middle of the academic term; therefore, they took half of the formative tests in class and the other half online. The researchers tracked students' vocabulary scores in the selected courses. The data were cleaned, including the removal of incomplete test scores. As presented in Table 1, 2971 students' scores were kept for further analysis. A sample formative vocabulary test administered through the Socrative application is shown in Figure 2.

Table 1

Students' Data

Course | CEFR Level | Year of Study | Male | Female | N | No. Quizzes in Class | No. Quizzes Online Remotely
GEN61-122 English for Academic Listening and Speaking | A1-A2 | 1st Year | 126 | 261 | 387 | 4 | 6
GEN61-123 English for Academic Reading and Writing | A1-A2 | 1st Year | 276 | 895 | 1171 | 5 | 5
GEN61-124 English for Academic Communication | A2-B1 | 2nd Year | 49 | 107 | 156 | 5 | 5
GEN61-127 English for Presentation in Sciences and Technology | A2-B1 | 2nd Year | 22 | 128 | 150 | - | -
GEN61-128 English for Presentation in Humanities and Social Sciences | A2-B1 | 2nd Year | 27 | 49 | 76 | 5 | 5
GEN61-129 English for Media Communication | Upper B1 | 2nd Year | 274 | 757 | 1031 | 5 | 5

Figure 2

Sample Vocabulary Tests on Socrative

Target Words of the Formative Vocabulary Tests

Each of the six selected courses had a target vocabulary of 500 academic English words ranging from A1 to B1 on the CEFR (Common European Framework of Reference). The words were divided into ten lists that students were required to study independently at home. Students were assigned to write definitions and sample sentences for each word in the vocabulary lists provided, studying one list per week. It was expected that this technique would enable students to acquire vocabulary on their own. In their independent vocabulary learning, students were encouraged to make the most of any available resources, such as dictionaries, Google Translate, etc. Students could also consult teachers about the words through Facebook if they wished to do so.

Procedure

The research procedures consisted of two phases. In the first phase, students took the formative vocabulary tests in class. At that time, the teaching and learning process was normal, and the COVID-19 outbreak had not reached the area. This phase ran from February 10 to April 1, 2020. In the second phase, students took the formative vocabulary tests online remotely, as the COVID-19 outbreak had reached the area. In response, the university moved all English classes online remotely from April 2, 2020, to the end of the term on May 1, 2020. Except for the mode of learning, all vocabulary test procedures were kept the same as in the first phase. Table 1 shows the number of formative vocabulary tests that students took in class and remotely online; all the courses had an equal number of tests in class and online except for Course 1 and Course 4. Figure 3 illustrates the data collection procedure.

Below are the procedures carried out in each research phase:

First Phase: In-Class Formative Vocabulary Tests

Each course prepared 500 target words, divided into ten vocabulary sets, prior to the start of the term. Each set had fifty words that students were required to study weekly, beginning in week two or three, depending on the course's lesson schedule. Each ten-minute test comprised fifteen multiple-choice questions covering word meanings, parts of speech, synonyms and antonyms, and sentence completion. Students completed the test by accessing Socrative.com via their smartphones. The teacher could monitor student progress from the classroom computer, display it on the projection screen, and roam around the room to prevent students from cheating.

Second Phase: Online Formative Vocabulary Tests Remotely

Due to the COVID-19 pandemic, the Thai government issued a national emergency decree on March 26, 2020, requiring Thai universities, including the site of this research, which was in the midst of the academic year 2019-2020, to transition from face-to-face to fully synchronous online learning on April 2, 2020. Teachers conducted lessons using a variety of conferencing platforms, including Zoom, Webex, and Microsoft Teams. When teachers administered vocabulary tests, they were able to track students' progress solely through their personal computers. They were unable to supervise the tests effectively due to several constraints, including the limited number of students per monitor, a lack of equipment, unfamiliarity with online teaching, and time management. These limits created an environment conducive to disobedience and cheating during the tests.


Table 2

Sample Target Words in Each Course

Course | CEFR Level | Example of Target Words
GEN61-122 English for Academic Listening and Speaking | A1-A2 | quite (adv), suppose (v), dish (n), important (adj), train station (n)
GEN61-123 English for Academic Reading and Writing | A1-A2 | activity (n), answer (n), describe (v), text (n), different (adj)
GEN61-124 English for Academic Communication | A2-B1 | history (n), according (adv), communication (n), compare (v), effect (n)
GEN61-127 English for Presentation in Sciences and Technology | A2-B1 | beginning (n), successful (adj), audience (n), divide (v), presentation (n)
GEN61-128 English for Presentation in Humanities and Social Sciences | A2-B1 | absolutely (adv), accompany (v), account (n), accurate (adj), although (conj)
GEN61-129 English for Media Communication | B1 | attract (v), brochure (n), buyer (n), celebrity (n), goods (n)

Figure 3

Data Collection Procedure

Phase 1: in-class (e.g., Week 3, Quiz 2; Week 4, Quiz 3; Week 5, Quiz 4; Week 6, Quiz 5). Phase 2: remotely online (through Week 10, Quiz 9).


Data Analysis

This study used IBM SPSS 25 for data analysis. Following collection, the data were cleaned, entered into SPSS, and prepared for analysis. Incomplete scores from absent students were excluded, so only students' complete scores from tests one through ten were included in the analysis across all courses. To answer the first research question, multiple paired-sample t-tests were performed; independent t-tests were used to examine the second research question. The multiple t-tests were performed on students' formative vocabulary in-class and online test scores separately for each course.
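For readers who want to replicate the analysis outside SPSS, a minimal sketch of the equivalent tests in Python follows. The study itself used IBM SPSS 25; the pandas/scipy calls, the column names, and the five/five phase split here are illustrative assumptions, since the actual in-class/online split varied slightly by course (see Table 1).

```python
# Minimal sketch of the study's analyses using pandas/scipy instead of SPSS.
# Column names ("test_1".."test_10", "gender") and the 5/5 phase split are
# illustrative assumptions; the actual split varied by course (see Table 1).
import pandas as pd
from scipy import stats

def analyze_course(df: pd.DataFrame):
    test_cols = [f"test_{i}" for i in range(1, 11)]
    # Data cleaning: keep only students with complete scores on all ten tests.
    df = df.dropna(subset=test_cols)

    # Per-student mean scores for the in-class and online remote phases.
    in_class = df[[f"test_{i}" for i in range(1, 6)]].mean(axis=1)
    online = df[[f"test_{i}" for i in range(6, 11)]].mean(axis=1)

    # RQ1: paired-sample t-test (same students, two learning conditions).
    rq1 = stats.ttest_rel(in_class, online)

    # RQ2: independent t-test comparing male and female students' online scores.
    rq2 = stats.ttest_ind(online[df["gender"] == "M"], online[df["gender"] == "F"])
    return rq1, rq2
```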

RESULTS

Student Performance on Vocabulary Tests before and during Online Remote Learning

To answer the first research question, multiple paired-sample t-tests were performed. The results showed significant differences between students' formative vocabulary test scores before and during COVID-19's online learning in five courses, while no significant difference was observed in one course. The analysis emphasized that students' scores were significantly higher during COVID-19's online learning. Out of 15, the means of students' scores increased from 9.33 to 11.54 (Course 1: d = 2.21, SD = 1.11, p < 0.01), from 11.7 to 12.70 (Course 2: d = 1, SD = 1.05, p < 0.01), from 11.52 to 12.15 (Course 3: d = 0.63, SD = 1.85, p < 0.01), from 11.81 to 12.04 (Course 4: d = 0.23, SD = 1.69, p < 0.05), and from 10.85 to 10.95 (Course 6: d = 0.10, SD = 1.11, p < 0.05), except for Course 5 (d = 0.23, SD = 1.61, p = .208). Among these, Course 2 had the largest effect size (Cohen's d = .8), small effect sizes were noted in Course 3 (Cohen's d = .3), Course 4 (Cohen's d = .1), and Course 5 (Cohen's d = .1), and very small effect sizes were obtained in Course 1 (Cohen's d = .02) and Course 6 (Cohen's d = .06). All SD values in each course were greater than 1.0, signifying high dispersion among students' test results on both in-class and online tests.

Table 3 exhibits the detailed results for each course. Chart 1 illustrates the differences in students' scores.

Comparison of Vocabulary Test Performance between Male and Female Students

Independent t-tests were used to examine the second research question. Multiple t-tests were performed on students' formative vocabulary in-class and online test scores separately for each course. The results indicated no significant differences in mean scores on either in-class or online tests between male and female students in all courses except Course 5. For example, in Course 1, the differences between the scores of male and female students before (Male = 9.19, Female = 9.41) and during COVID-19 (Male = 11.50, Female = 11.57), all out of 15, were nonsignificant. The case was similar for Courses 2, 3, 4, and 6: the differences in scores between male and female students were nonsignificant. However, male students outperformed female students in Course 5 on in-class tests (t(74) = 2.04, p = .045), with a medium effect size (Cohen's d = (11.31 - 10.39)/1.92 = .5), and on online tests (t(74) = 2.50, p = .015), also with a medium effect size (Cohen's d = (11.54 - 10.61)/1.63 = .6). This trend paralleled the findings from the first research question, where Course 5 stood out. Table 4 presents the detailed results.
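The medium effect sizes reported for Course 5 can be recomputed from the means and standard deviations in Table 4. A short worked check follows; the equal-weight pooled-SD denominator is an assumption that reproduces the in-text arithmetic, since the paper does not state exactly which SD it used.

```python
# Recomputing the Course 5 effect sizes from Table 4 (means and SDs as
# reported; the pooled-SD formula is an assumption that matches the
# in-text arithmetic, d = (M_male - M_female) / SD_pooled).
import math

def cohens_d(m1: float, sd1: float, m2: float, sd2: float) -> float:
    sd_pooled = math.sqrt((sd1**2 + sd2**2) / 2)  # equal-weight pooled SD
    return (m1 - m2) / sd_pooled

print(round(cohens_d(11.31, 2.05, 10.39, 1.78), 2))  # in-class: ~0.48 (reported .5)
print(round(cohens_d(11.54, 1.86, 10.61, 1.37), 2))  # online:   ~0.57 (reported .6)
```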

DISCUSSION

This study identified the likelihood of student cheating at formative vocabulary tests conducted before and during online remote learning. It adopted the approaches employed by previous studies: comparing students' test scores in offline and online settings (e.g., Brallier & Palm, 2015; Chuang et al., 2017; Ranger et al., 2020) and exploring students' grade patterns (Arnold, 2016). The first analysis showed that students' test scores increased significantly when the tests were moved to online remote settings in five courses. The increase was not statistically significant in one course, Course 5, which had the smallest sample size of the six courses. The descriptive patterns of the students' scores, as shown in Chart 1, confirmed that they achieved greater scores during online remote learning than during the previous face-to-face learning. Thus, these findings corroborate previous studies indicating that students perform better on tests in online remote contexts (Chuang et al., 2017; Fask et al., 2014; Ranger et al., 2020). Given the notable increments observed across the majority of the chosen courses, this study aligns with Arnold's (2016) findings, suggesting that instances of student irregularities in online formative tests might have taken place. Such occurrences, if indeed present, could have a discernible impact on students' formative scores.

Table 3

Results of T-tests across the Six Courses

Course | Means/SD In-class (Before COVID-19) | Means/SD Online (During COVID-19) | t, p-value | Cohen's d Effect Size*
1 | 9.33/1.50 | 11.54/1.11 | -32.21, p < .001 | .02 (Very Small)
2 | 11.7/1.33 | 12.70/1.05 | -28.18, p < .001 | .8 (Large)
3 | 11.52/1.88 | 12.15/1.85 | -6.46, p < .001 | .3 (Small)
4 | 11.81/1.68 | 12.04/1.69 | -2.33, p = .021 | .1 (Small)
5 | 10.71/1.91 | 10.94/1.61 | -1.27, p = .208 | .1 (Small)
6 | 10.85/1.66 | 10.95/1.85 | -2.03, p = .042 | .06 (Very Small)

Note. *Based on Plonsky and Oswald's (2014) benchmarks for Cohen's d effect sizes.

Chart 1

Illustration of the Students' Scores In-Class and During Online Remote Learning (mean scores per course, Courses 1-6; series: Online vs. In-Class)

Table 4

Results for Cohen's d Effect Size across the Six Courses

Course | In-class Male (M/SD) | In-class Female (M/SD) | Online Male (M/SD) | Online Female (M/SD) | t, p-value | Cohen's d In-Class | Cohen's d Online
1 | 9.19/1.36 | 9.41/1.56 | 11.50/1.13 | 11.57/1.11 | 1.307, p = .19 | .2 (Small) | .06 (Very Small)
2 | 11.7/1.34 | 11.74/1.33 | 12.70/1.15 | 12.71/1.03 | .454, p = .65 | .03 (Very Small) | .01 (Very Small)
3 | 11.31/1.86 | 11.62/1.70 | 12.05/2.01 | 12.21/1.79 | .947, p = .35 | .2 (Small) | .4 (Small)
4 | 11.97/2.01 | 11.79/1.63 | 12.15/2.31 | 12.03/1.58 | .478, p = .63 | .01 (Very Small) | .07 (Very Small)
5 | 11.31/2.05 | 10.39/1.78 | 11.54/1.86 | 10.61/1.37 | 2.04, p = .045 | .5 (Medium) | .6 (Medium)
6 | 10.85/1.77 | 10.86/1.62 | 11.06/1.93 | 10.92/1.83 | .061, p = .95 | .01 (Very Small) | .07 (Very Small)

Several pedagogical implications emerge from the study's findings. Teachers are urged to exercise caution when interpreting students' formative test scores. Now that the study has established the potential for cheating, test scores may not accurately reflect students' true abilities. Teachers should be aware that weak students perceived the assessments administered during online remote learning as opportunities to cheat their way to a better grade (Shoaib & Zahran, 2021; Taherkhani & Aref, 2024). Teachers are urged to utilize paraphrased test questions for which the solutions are not readily available online (Golden & Kohlbeck, 2020) while tightening the proctoring process through the use of webcam recording software, which can be beneficial during tests and for post-test evaluation (Dendir & Maxwell, 2020). Moreover, formative evaluation cannot be treated in the same way as in face-to-face learning. Indeed, online formative assessments are crucial as a benchmark for differentiating the learning aid provided to students throughout the learning process (Gikandi et al., 2011). Nonetheless, teachers must monitor students' behavior and performance in online remote learning classes. Teachers may wish to ask students who do poorly or well on formative assessments to validate their expected competencies. This type of technique may assist teachers in determining the validity of students' formative test results.

Furthermore, these initial results contribute to our understanding that, while employing technology-based formative assessments appears to be a smart idea, the risk of student cheating has been observed in both online and offline situations. Rofiah and Waluyo (2020) discovered that, although students accepted Socrative.com as a means for conducting vocabulary formative tests, they acknowledged using an online dictionary, chatting with online peers, and browsing the internet during formative tests on Socrative. These activities become even more convenient when teachers and students are located in different places in online classes. Students can create excuses for not turning on their cameras during COVID-19's online learning, such as poor internet connections or the lack of a camera. Even when students activate their cameras, teachers' visibility remains limited, particularly in large classes (Koger & Koksal, 2024). In the current study, the online test setting did include teacher proctoring; although it did not work perfectly, it was still preferable to no teacher proctoring at all. As with formative assessment generally, this study advocates for online tests administered through IRS-based technology such as Socrative to account for only a small portion of a student's grade. If online test results carry less weight in a student's grade, the incentive to cheat during tests may be reduced.

The subsequent statistical analysis revealed that female and male students fared equally well on formative vocabulary tests prior to and during COVID-19's online learning in five courses, with the exception of Course 5, where a significant difference was observed. Given that all five of those courses had larger student populations than Course 5, this study only partially corroborates earlier research indicating that male students are more likely to cheat than female students (Muntada et al., 2013; Tibbetts, 1999; Zhang et al., 2018). The current study's findings may indicate that cheating on tests in online environments differs from cheating in offline ones. Prior research indicates that online learning not only increases opportunities for cheating (Brallier & Palm, 2015; Harmon & Lambrinos, 2008; Pratiwi & Waluyo, 2022) but also causes a slew of negative emotions in students, such as stress, anxiety, and worry, especially during the COVID-19 pandemic, and introduces technical difficulties and personal discomforts; on this basis, this study asserts that students, regardless of gender, will cheat on online formative tests. COVID-19 inherently generates negative emotions and insecurity in students, whether about their personal safety and that of their families or about their academic performance in terms of grades, causing students to perceive formative tests as difficult, which can result in cheating as a temporary and easy solution (Apridayani, 2022; Wenzel & Reinhard, 2020).

CONCLUSION

After assessing students' performance on formative vocabulary tests prior to and throughout online remote classes in English courses, this study concluded that the considerable rise in scores on online remote tests indicates the likelihood of cheating. However, the expectation that male students cheat more frequently was not borne out, contradicting findings from offline assessments. Given the study's shortcomings, this study is best suited as a pilot study for attempting to uncover student cheating on formative tests in remote online English classes. Had qualitative interviews with students been conducted, the study would have garnered further insights; however, language barriers and the restrictions imposed by the COVID-19 outbreak prevented the researchers, who were foreigners, from conducting them. As we proceed with online remote English classes, the researchers hope that the study's findings will alert English teachers to the possibility of cheating and how to address the issue.

DECLARATION OF COMPETING INTEREST

None declared.

AUTHORS' CONTRIBUTION

Budi Waluyo: conceptualization, data curation, formal analysis, investigation, methodology, project administration, resources, software, supervision, validation, visualization, writing - original draft, writing - review & editing.

Nur Lailatur Rofiah: conceptualization, data curation, formal analysis, investigation, methodology, project administration, resources, software, supervision, validation, visualization, writing - original draft, writing - review & editing.

REFERENCES

Abdulla, M. H., Brint, E., & Rae, M. K. (2021). Teaching physiology to medical students in the COVID-19 era with synchronous formative assessments utilizing simultaneous, combined Zoom and Socrative platforms. Scierea Journal of Education, 6(1), 12-32. https://doi.org/10.54647/education88192

Aiken, L. R. (1991). Detecting, understanding, and controlling for cheating on tests. Research in Higher Education, 32(6), 725-736. https://doi.org/10.1007/BF00974740

Arnold, I. J. (2016). Cheating at online formative tests: Does it pay off? The Internet and Higher Education, 29, 98-106. https://doi.org/10.1016/j.iheduc.2016.02.001

Apridayani, A., Han, W., & Waluyo, B. (2023). Understanding students' self-regulated learning and anxiety in online English courses in higher education. Heliyon, 9(6), 1-12. https://doi.org/10.1016/j.heliyon.2023.e17469

Apridayani, A. (2022). Exploring Thai EFL Students' Self-Regulated Learning (SRL) strategies and English Proficiency. MEXTESOL Journal, 46(1), 1-10.

Bilen, E., & Matros, A. (2021). Online cheating amid COVID-19. Journal of Economic Behavior & Organization, 182, 196-211. https://doi.org/10.1016/j.jebo.2020.12.004

Brallier, S., & Palm, L. (2015). Proctored and unproctored test performance. International Journal of Teaching and Learning in Higher Education, 27(2), 221-226.

Chung, S. J., & Choi, L. J. (2021). The development of sustainable assessment during the COVID-19 pandemic: The case of the English language program in South Korea. Sustainability, 13(8), 4499. https://doi.org/10.3390/su13084499

Chuang, C. Y., Craig, S. D., & Femiani, J. (2017). Detecting probable cheating during online assessments based on time delay and head pose. Higher Education Research & Development, 36(6), 1123-1137. https://doi.org/10.1080/07294360.2017.1303456

Christianson, A. M. (2020). Using Socrative online polls for active learning in the remote classroom. Journal of Chemical Education, 97(9), 2701-2705. https://doi.org/10.1021/acs.jchemed.0c00737

Daniels, L. M., Goegan, L. D., & Parker, P. C. (2021). The impact of COVID-19 triggered changes to instruction and assessment on university students' self-reported motivation, engagement and perceptions. Social Psychology of Education, 24(1), 299-318. https://doi.org/10.1007/s11218-021-09612-3

Dendir, S., & Maxwell, R. S. (2020). Cheating in online courses: Evidence from online proctoring. Computers in Human Behavior Reports, 2, 100033. https://doi.org/10.1016/j.chbr.2020.100033

Dick, M. J., Sheard, J. I., Bareiss, C., Carter, J., Joyce, D., Harding, T., & Laxer, C. (2003). Addressing student cheating: Definitions and solutions. SIGCSE Bulletin Inroads, 35(2), 172-184. https://doi.org/10.1145/782941.783000

Fask, A., Englander, F., & Wang, Z. (2014). Do online exams facilitate cheating? An experiment designed to separate possible cheating from the effect of the online test taking environment. Journal of Academic Ethics, 12(2), 101-112. https://doi.org/10.1007/s10805-014-9207-1

Finn, K. V., & Frone, M. R. (2004). Academic performance and cheating: Moderating role of school identification and self-efficacy. The Journal of Educational Research, 97(3), 115-121. https://doi.org/10.3200/JOER.97.3.115-121

Fontaine, J. (2012). Online classes see cheating go high-tech. Chronicle of Higher Education, 58(38), A1-2.

Ghanbari, N., & Nowroozi, S. (2021). The practice of online assessment in an EFL context amidst COVID-19 pandemic: Views from teachers. Language Testing in Asia, 11(1), 1-18. https://doi.org/10.1186/s40468-021-00143-4

Gikandi, J., Morrow, D., & Davis, N. (2011). Online formative assessment in higher education: A review of the literature. Computers and Education, 57(4), 2333-2351. https://doi.org/10.1016/j.compedu.2011.06.004

Golden, J., & Kohlbeck, M. (2020). Addressing cheating when using test bank questions in online classes. Journal of Accounting Education, 52, 100671. https://doi.org/10.1016/j.jaccedu.2020.100671

Harmon, O. R., & Lambrinos, J. (2008). Are online exams an invitation to cheat? The Journal of Economic Education, 39(2), 116-125. https://doi.org/10.3200/JECE.39.2.116-125

Janke, S., Rudert, S. C., Petersen, A., Fritz, T. M., & Daumiller, M. (2021). Cheating in the wake of COVID-19: How dangerous is ad-hoc online testing for academic integrity? Computers and Education Open, 2, 100055. https://doi.org/10.31234/osf.io/6xmzh

Koger, P., & Koksal, D. (2024). An investigation into the online language teaching and assessment practices during COVID-19. International Journal of Educational Spectrum, 6(1), 1-17. https://doi.org/10.47806/ijesacademic

Ladyshewsky, R. K. (2015). Post-graduate student performance in 'supervised in-class' vs. 'unsupervised online' multiple choice tests: Implications for cheating and test security. Assessment & Evaluation in Higher Education, 40(7), 883-897. https://doi.org/10.1080/02602938.2014.956683

McCabe, D. L., & Trevino, L. K. (1993). Academic dishonesty: Honor codes and other contextual influences. The Journal of Higher Education, 64(5), 522-538. https://doi.org/10.1080/00221546.1993.11778446

McCabe, D. L., Trevino, L. K., & Butterfield, K. D. (2001). Cheating in academic institutions: A decade of research. Ethics & Behavior, 11(3), 219-232. https://doi.org/10.1207/S15327019EB1103_2

McNabb, L., & Olmstead, A. (2009). Communities of integrity in online courses: Faculty member beliefs and strategies. Journal of Online Learning and Teaching, 5(2), 208-223.

Meccawy, Z., Meccawy, M., & Alsobhi, A. (2021). Assessment in 'survival mode': Student and faculty perceptions of online assessment practices in HE during COVID-19 pandemic. International Journal for Educational Integrity, 17(1), 1-24. https://doi.org/10.1007/s40979-021-00083-9


Moorhouse, B. L., & Kohnke, L. (2021). Responses of the English-language-teaching community to the COVID-19 pandemic. RELC Journal, 52(3), 359-378. https://doi.org/10.1177/00336882211053052

Muntada, M. C., Martin, M. D. M. B., i Pros, R. C., & Busquets, C. G. (2013). Academic cheating and gender differences in Barcelona (Spain). Summa Psicologica UST, 10(1), 65-72.

Nguyen, J. G., Keuseman, K. J., & Humston, J. J. (2020). Minimize online cheating for online assessments during COVID-19 pandemic. Journal of Chemical Education, 97(9), 3429-3435. https://doi.org/10.1021/acs.jchemed.0c00790

Pratiwi, D. I., & Waluyo, B. (2022). Integrating task and game-based learning into an online TOEFL preparatory course during the COVID-19 outbreak at two Indonesian higher education institutions. Malaysian Journal of Learning and Instruction, 19(2), 37-67. https://doi.org/10.32890/mjli2022.19.2.2

Plonsky, L., & Oswald, F. L. (2014). How big is "big"? Interpreting effect sizes in L2 research. Language Learning, 64(4), 878-912. https://doi.org/10.1111/lang.12079

Prastikawati, E. F. (2021). Pre-service EFL teachers' perception on technology-based formative assessment in their teaching practicum. ELT Forum: Journal of English Language Teaching, 10(2), 163-171. https://doi.org/10.15294/elt.v10i2.47965

Raje, S., & Stitzel, S. (2020). Strategies for effective assessments while ensuring academic integrity in general chemistry courses during COVID-19. Journal of Chemical Education, 97(9), 3436-3440. https://doi.org/10.1021/acs.jchemed.0c00797

Ranger, J., Schmidt, N., & Wolgast, A. (2020). The detection of cheating on e-exams in higher education: The performance of several old and some new indicators. Frontiers in Psychology, 11, 568825. https://doi.org/10.3389/fpsyg.2020.568825

Reedy, A., Pfitzner, D., Rook, L., & Ellis, L. (2021). Responding to the COVID-19 emergency: Student and academic staff perceptions of academic integrity in the transition to online exams at three Australian universities. International Journal for Educational Integrity, 17(1), 1-32. https://doi.org/10.1007/s40979-021-00075-9

Rofiah, N. L., Aba, S. A. M. Y. M., & Waluyo, B. (2022). Digital divide and factors affecting English synchronous learning during COVID-19 in Thailand. International Journal of Instruction, 15(1), 633-652. https://doi.org/10.29333/iji.2022.15136a

Rofiah, N. L., & Waluyo, B. (2020). Using Socrative for vocabulary tests: Thai EFL learner acceptance and perceived risk of cheating. Journal of Asia TEFL, 17(3), 966-982. http://dx.doi.org/10.18823/asiatefl.2020.173.14.966

Shoaib, A. M., & Zahran, K. A. (2021). Systematic collective e-cheating in a Saudi Arabian higher education context: A case study. Higher Learning Research Communications, 11(2), 6. https://orcid.org/0000-0002-8649-5758

Simon, C. A., Carr, J. R., McCullough, S. M., Morgan, S. J., Oleson, T., & Ressel, M. (2004). Gender, student perceptions, institutional commitments and academic dishonesty: Who reports in academic dishonesty cases? Assessment & Evaluation in Higher Education, 29(1), 75-90. https://doi.org/10.1080/0260293032000158171

Taherkhani, R., & Aref, S. (2024). Students' online cheating reasons and strategies: EFL teachers' strategies to abolish cheating in online examinations. Journal of Academic Ethics, 1-21. https://doi.org/10.1007/s10805-024-09502-1

Tibbetts, S. G. (1999). Differences between women and men regarding decisions to commit test cheating. Research in Higher Education, 40(3), 323-342. https://doi.org/10.1023/A:1018751100990

Vellanki, S. S., Mond, S., & Khan, Z. K. (2023). Promoting academic integrity in remote/online assessment - EFL teachers' perspectives. TESL-EJ, 26(4), 1-21. https://doi.org/10.55593/ej.26104a7

Waluyo, B. (2020). Thai EFL learners' WTC in English: Effects of ICT support, learning orientation, and cultural perception. Humanities, Arts and Social Sciences Studies, 20(2), 477-514. https://doi.org/10.14456/hasss.2020.18

Waluyo, B., & Apridayani, A. (2021). Teachers' beliefs and classroom practices on the use of video in English language teaching. Studies in English Language and Education, 8(2), 726-744. https://doi.org/10.24815/siele.v8i2.19214

Waluyo, B., & Tuan, D. T. (2021). Understanding help-seeking avoidance among EFL students and the social climate of EFL classrooms in Thailand. Journal of Asia TEFL, 18(3), 800-815. http://dx.doi.org/10.18823/asiatefl.2021.183A800

Walsh, L. L., Lichti, D. A., Zambrano-Varghese, C. M., Borgaonkar, A. D., Sodhi, J. S., Moon, S., Wester, E. R., & Callis-Duehl, K. L. (2021). Why and how science students in the United States think their peers cheat more frequently online: Perspectives during the COVID-19 pandemic. International Journal for Educational Integrity, 17(1), 1-18. https://doi.org/10.1007/s40979-021-00089-3

Wenzel, K., & Reinhard, M. A. (2020). Tests and academic cheating: Do learning tasks influence cheating by way of negative evaluations? Social Psychology of Education, 23(3), 721-753. https://doi.org/10.1007/s11218-020-09556-0

Zhang, C., Yan, X., & Wang, J. (2021). EFL teachers' online assessment practices during the COVID-19 pandemic: Changes and mediating factors. The Asia-Pacific Education Researcher, 30(6), 499-507. https://doi.org/10.1007/s40299-021-00589-3

Zhang, Y., Yin, H., & Zheng, L. (2018). Investigating academic dishonesty among Chinese undergraduate students: does gender matter? Assessment & Evaluation in Higher Education, 43(5), 812-826. https://doi.org/10.1080/02602938.2017.1411467
