Teaching in a pandemic: a comparative evaluation of online vs. face-to-face student outcome gains

The COVID-19 pandemic forced the education sector to transform significantly in order to support students across the world. Technology played a crucial role in enhancing and adapting traditional learning to digital resources and networks, which are now an essential component of education. However, there is concern about the quality and effectiveness of remote teaching, which lacks the real-life feel of more traditional face-to-face education. Our study analysed two separate groups of students enrolled in the same course but taught either face-to-face or remotely. The results show no statistically significant difference in students’ performance or gains, even for laboratory work and the resulting reports. However, there was a statistically significant difference in Turnitin scores between these groups, with the remote students showing higher levels of plagiarism than the traditional face-to-face students. These results support the view that remote teaching can be a valid alternative, if not a substitute, for face-to-face teaching in the future. The study’s findings are expected to help instructors who are considering delivering programs through blended learning in the post-pandemic era.


1 Introduction

The COVID-19 pandemic has drastically changed the world. One of the sectors that has experienced a significant transformation is higher education [1, 2]. The emergence of the COVID-19 pandemic and the resulting isolation necessitated a drastic shift to digitalization in education worldwide [1]. The profound effects of the pandemic on students have not yet been fully realized, but short-term ramifications include an inability to socialize or work in groups, no access to campus-based classrooms or laboratories, and a cessation of ‘normal’ university life [3, 4]. Whilst a wide array of technological advances has given educators the opportunity to engage in non-traditional classroom techniques, the COVID-19 pandemic required educators to learn online pedagogical methods and to quickly adjust to the ‘new normal’ [5]. For most people, the COVID-19 pandemic and ensuing lockdowns were unique experiences that nonetheless permanently changed the nature of how we work and learn, by proving that working and learning from home are viable alternatives to attending classes or meetings in person. Even now, as the threat of the pandemic abates, hybrid teaching and learning has become a necessary tool in pedagogical content knowledge [4]. Online and hybrid pedagogies are not necessarily new concepts, but they are novel in the context of normative university education, especially with regard to courses that require practical application of conceptual learning, like laboratories [6, 7]. During the lockdown, educators were forced to shift their teaching online, often without any training or understanding of how the delivery of content must change [5, 8, 9]. Of particular concern to science educators in higher education was how laboratory techniques could possibly be taught without hands-on learning. These concerns came in addition to traditional issues in teaching science, such as the possibility of plagiarism or a student falling behind in classwork.
Therefore, it is crucial to understand whether there are significant differences in conceptual gains between students who attend lecture/laboratory courses in a face-to-face (F2F) format versus those who attend virtually. To address this question, we studied the conceptual gains of F2F and remotely connected students enrolled in the same course by comparing the final exam, lab report and Turnitin scores of all students.

2 Literature review

A variety of technologies have enabled us to enhance and adapt traditional learning approaches with computer-based resources [10]. Educators have been encouraged to develop technology-based learning media capable of meeting school curricula and national standards [11]. The utilization of the latest available technologies, such as cloud servers, 3D printing, Augmented Reality (AR), Smartboards, video conferencing, FlipGrid, hybrid learning and adaptive learning platforms, in science education has been widely applauded as innovative [12, 13]. However, most of the scholarly work on technology use in science education was conducted or published prior to the pandemic, and many of these studies fold technology into traditional classroom teaching. The pandemic has created an urgent need for robust studies that test the efficacy of fully online and hybrid science education.

Whilst some educators have been concerned about plagiarism rates and how online education may affect them, Ison [14] debunked the myth that online institutions and learning methods contributed to the prevalence of plagiarism. The study found that traditional schools had more extreme cases of plagiarism than online institutions. Four years later, Ison published a paper on differences in plagiarism between world cultures, in which he showed that Western European students (including those in the UK) plagiarise far less frequently than their counterparts in India and China [15]. In this paper, he notes that cultural differences account for some of what would be considered plagiarism in the UK, i.e., ‘significant differences were found to exist between Chinese and Western scholars in their perceived requirement to acknowledge authorship of source documents’ [15]. Turnitin also needs to be considered for what it actually does as a program, although most universities now require its use in grading as a shortcut for separating plagiarised from original papers. A recent study by Gallant et al. describes the use of Turnitin in determining the originality of laboratory reports; this study found that most of the text flagged by Turnitin was reworded from the textbook and could not be described as plagiarism [46]. Lab experiments are an intrinsic part of science education, and recreating the lab experience online has proved to be a popular challenge [16,17,18,19,20]. Several distinct factors, such as the reduction in equipment needs, availability at any time from anywhere, and the opportunity for students to learn at their own pace while exploring difficult or interesting concepts, have been highlighted as benefits of using virtual experiments in education [21].
These virtual laboratory sessions can be utilized by both undergraduate and postgraduate students, as lab sessions follow the same principles and vary only in complexity depending on the level of study [22, 23]. However, opinions on how to use virtual experiments in science and on their impact on students’ learning outcomes have varied widely. In a review of recent advances, it was reported that students attain a deeper understanding of science when virtual labs are combined with real hands-on labs [20], and virtual labs have been shown to increase students' grasp of key laboratory techniques [24]. For instance, virtual labs have been used to carry out dangerous experiments or experiments impossible in real-life situations [25].

Some student responses to online labs in the published literature reflect negative concerns. In a survey assessing the experience of students who had participated in an online lab exercise, students lamented missing out on specific lab activities they had long anticipated participating in, in person. A major concern raised was the lack of access to their lab instructor during the exercise, although the instructor’s e-mail was readily available to the students [26]. Scheckler [10] asserted that the virtual lab experience removes participants from the reality of the physical lab, where specimens can be handled physically. Other negative aspects of online labs are inherent technical problems such as website failure, access to the internet, the use of specific technological tools and applications to access the lab, and the accuracy and continued existence of hyperlinks associated with online lab glossaries [10].

Overall, feedback from students in the published literature and attitudes towards virtual education have been positive. Students' reception of virtual laboratories has been positive [27], with instant feedback, flexible access, and test–retest reliability highlighted as the major benefits of virtual education [28,29,30]. Similarly, students have reported increased engagement with laboratory materials and quizzes and greater knowledge progression during virtual labs in comparison to prior physical laboratory experiences [31]. Although virtual education has the potential to revolutionize face-to-face (F2F) learning and teaching in higher education institutions [12, 13, 29], it has been criticized for lacking the real-life feel that face-to-face education offers. Research has shown that face-to-face interaction plays a critical role in education, especially in science education [32, 33]. It has been hypothesized that with augmented reality, sensorial devices, live videos, and interactive videos, the technologies used for virtual education can be improved to offer a life-like experience while retaining all the benefits that virtual education offers [34].

From a technological standpoint, introducing virtual learning technology into the education process demands modifications of the existing protocols present in the traditional learning approach [11, 25, 35]. For instance, to input new learning content, educators must at least understand the underlying technology behind virtual learning technology [11, 25, 35]. This has shown to be particularly challenging as creating realistic virtual models of objects requires cooperation between experts on respective subjects and highly skilled programmers and graphic designers [11, 25, 35, 36]. To be most effective in bringing a traditional science course online, teachers must be trained in the newest technologies and have the time to create the necessary resources to make their virtual courses as successful as their face-to-face teaching.

While it is reasonable to be excited about all the innovative technologies revolutionizing education, it is essential to carefully consider whether virtual education is genuinely beneficial to students’ learning. Several studies have attempted to evaluate the specific benefits of virtual education, particularly virtual labs [36,37,38,39]. Studies of the effectiveness of student learning in virtual versus face-to-face laboratory learning activities have produced contradictory results [36, 37], often due to the lack of a control group. Studies utilizing control groups to evaluate students’ performance in virtual and face-to-face education have likewise produced contrasting results. For example, an investigation that utilized control groups suggested no significant differences in students’ performance between two learning formats, a traditional and a simulated lab [36]. On the other hand, another study with a larger sample size proposed that virtual education significantly improved students’ learning outcome gains [39]. Other studies have also evaluated the benefits of utilizing virtual education in lecturing, assessment, and quizzes [39,40,41,42]. The emerging picture suggests positive achievements in students’ gains with regard to classroom work, but not laboratory work. Interestingly, these outcome gains were independent of class size and subject, and similar gains were achieved with technology-based and non-technology-dependent techniques [39,40,41].

However, significant disagreement exists among science educators regarding the means and purpose of the laboratory component in science courses [36, 37]. This divergence of opinion has become the single biggest factor in the debate over the efficacy of non-traditional versus traditional learning [16, 18]. A meta-analysis of trends in virtual and traditional learning revealed that fewer than 70% of studies published before 2002 favored online education, while 84% of studies published after 2003 did so [43]. Empirical studies after 2005 show a similar trend in favorability and support for virtual and remote learning. The majority of studies reviewed claimed that students’ outcome gains in virtual education were equal to or greater than achievement in face-to-face education [44].

Not only has virtual education become more prevalent in recent times, but it has also made it easier to accommodate and manage the increasing numbers of students enrolling in undergraduate and graduate programs [2, 11]. In a similar vein, the need to find alternative means of instructing students and assessing students’ performance has increased, as one teacher is insufficient to meet the needs of so many students [45]. Most significantly, there is also a need to ensure that these alternative means have the same effects and outcomes as attained in the face-to-face (F2F) methods of teaching, learning, and assessment, in light of the recent pandemic.

Our study aims to compare two education models in terms of student learning outcomes. Specifically, we compared two groups of students, one receiving the lectures entirely virtually and the other attending the class in a traditional face-to-face (F2F) fashion. The study’s findings are expected to aid instructors who are contemplating delivering programs using blended learning in the post-pandemic period.

3 Methods

3.1 Teaching module overview

This study is based on the teaching module named “Food and Microbes” of the School of Chemical Engineering at the University of Birmingham, class 2020/21. This module revolves around the current and existing knowledge of food microbiology, introducing students to the basic concepts of epidemiology and the control of infectious diseases, as well as factors affecting food spoilage, the survival of pathogens and the association of specific microbes with certain foods. A variety of teaching methods are employed in this teaching module. Although most of the course is delivered in lecture format, in practice this includes a mixture of formal teaching, case studies, practical exercises, and a laboratory practical. A feature of the course is the inter-relationship between pure and applied microbiology and its application to industrial processes and the understanding of food safety.

3.2 Participants

The 2020/21 class was composed of 30 postgraduate students. Due to the small number of students (80% Chinese, 10% British, and the rest from Africa and America) and the experimental nature of the comparative groups, we consider this to be a case study. While 18 students could attend the lectures in person, the remaining 12 students were enrolled from China and could not travel to the UK to attend the course due to COVID-19 pandemic restrictions. Therefore, the two groups were named face-to-face (F2F) and Remote, respectively. These circumstances forced the teaching sector to adjust and adapt the teaching approach, as well as students’ learning. Moreover, we considered these conditions ideal for recording all possible information and comparing teaching and learning between the two groups. While F2F students attended the class in person, the Remote group attended the lectures from China via Zoom streaming. For the scope of this study, there was no further categorization (i.e. nationality, gender, etc.) of the students involved. The lectures were streamed live so that all students participated at the same time. Moreover, all students had access to the teaching material on a dedicated Canvas page. Canvas is a popular learning management system (LMS) used by many universities and educational institutions around the world. It provides a platform for instructors to manage course materials, assignments, quizzes, discussions, and grades, while offering students a centralized place to access course content, submit assignments, communicate with instructors and peers, and track their progress. Canvas is known for its user-friendly interface and robust features, making it a widely adopted choice in higher education. During the course, all students participated in the formative quizzes, lab report and final assessment. All marks were uploaded to the Canvas page, which facilitated the collection of the data used for this study.

3.3 Data collection and analysis

We employed a quantitative approach to evaluate the educational outcomes of online vs. face-to-face teaching methods. Data were collected through formative quizzes, summative lab reports, and final exams, and analyzed using independent-sample t-tests to compare the mean scores of the two independent groups (F2F and Remote) across different assessments. t-tests are appropriate for comparing the means of two groups when the data are assumed to be normally distributed.

The data of F2F and Remote students employed in this study were stored on the Canvas page of the course and collected through the tutor account. Data were exported and organised in an Excel spreadsheet, then analysed with GraphPad Prism 9 software. Multiple comparisons with independent-sample t-tests were performed. Levene’s test was applied to the average scores of quizzes, final exams, lab reports, and Turnitin scores for both F2F and Remote groups to assess the equality of variances, a necessary condition for conducting independent-sample t-tests. Statistical significance was set at p < 0.05. Our analysis focused on comparing average scores, median values, and standard deviations (SD) across the different assessments (quizzes, final exams, and lab reports) between the face-to-face (F2F) and Remote groups. A two-way ANOVA was also conducted to examine the interaction effects between the type of learning delivery (F2F vs. Remote) and students' performance metrics across the different assessments; this approach helps to understand whether the mode of delivery impacts the outcome variables.
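The variance check followed by the two-group comparison described above can be sketched with Python's scipy; this is a minimal sketch using synthetic placeholder scores, not the study data (the group sizes of 18 and 12 mirror the cohort described below):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic placeholder scores (NOT the study data): 18 F2F and 12 Remote students.
f2f = rng.normal(loc=83.5, scale=11.3, size=18)
remote = rng.normal(loc=80.2, scale=11.5, size=12)

# Levene's test checks the equal-variance assumption required by the
# classic (pooled) independent-sample t-test.
lev_stat, lev_p = stats.levene(f2f, remote)

# If variances look equal (p > 0.05), use the pooled t-test;
# otherwise fall back to Welch's t-test (equal_var=False).
equal_var = bool(lev_p > 0.05)
t_stat, t_p = stats.ttest_ind(f2f, remote, equal_var=equal_var)

print(f"Levene: W={lev_stat:.3f}, p={lev_p:.3f}")
print(f"t-test: t={t_stat:.3f}, p={t_p:.3f}")
```

GraphPad Prism performs the same tests through its interface; the scripted version is shown only to make the analysis steps explicit.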

All students took the exam online due to COVID-19 restrictions. The exam was timed and consisted mostly of open-ended questions to minimize the possibility of copying and pasting from external sources. Furthermore, question pools were utilised and set to randomise questions, ensuring that students did not all answer a uniform set of questions. Students were given clear instructions on how to access the exam and how much time they had to complete it. The exam format and questions were reviewed by the instructors to ensure that they were relevant to the course content and could effectively assess student knowledge and understanding. In addition, Turnitin was used to check for instances of plagiarism, improper citation, or unoriginal content in student submissions. This software provides a similarity score indicating the percentage of text in the document that matches existing sources, helping instructors ensure academic integrity and promote originality in student work. To ensure that the exam was taken by the students themselves and not a proxy, all students were required to join an exam Zoom meeting and leave their cameras on, but remain muted, throughout the duration of the exam.
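The question-pool randomisation described above can be sketched as follows; the pool names, question labels, and per-pool counts here are hypothetical illustrations, not the actual exam content:

```python
import random

# Hypothetical question pools (the real exam content is not reproduced here).
pools = {
    "epidemiology": ["Q1", "Q2", "Q3", "Q4"],
    "food_spoilage": ["Q5", "Q6", "Q7", "Q8"],
    "pathogen_survival": ["Q9", "Q10", "Q11", "Q12"],
}

def build_exam(pools, per_pool=2, seed=None):
    """Draw `per_pool` questions at random from each pool, so each student
    receives a different but topically balanced question set."""
    rng = random.Random(seed)
    exam = []
    for topic, questions in pools.items():
        # sample() draws without replacement, so no question repeats.
        exam.extend(rng.sample(questions, per_pool))
    return exam

# Seeding with, e.g., a student identifier yields a per-student randomised exam.
print(build_exam(pools, per_pool=2, seed=42))
```

Learning management systems such as Canvas implement this behaviour natively through their question-bank features; the sketch only makes the sampling logic explicit.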

3.4 Ethical considerations

While no formal approval was required due to the nature of the study, we followed strict protocols to ensure participant privacy and obtained informed consent from all students involved. Data anonymization was rigorously implemented: personal identifiers were replaced with unique codes, and demographic details that could potentially reveal participant identity were carefully obscured. Data were handled with the utmost security, stored on encrypted servers with access strictly limited to authorized personnel, and secure protocols were employed for any data transfer. Informed consent was integral to the participant engagement process: all participants were thoroughly briefed about the study's aims, methods, and their rights, ensuring they understood that their participation was entirely voluntary and could be withdrawn at any time.

4 Results

The students attending the module, “Food and Microbes”, were split into two groups based on the type of enrolment, F2F or Remote. During the course, both groups of students were marked via numerous formative quizzes to test their improvement, and then evaluated via a summative lab report and final exam. Two-way between-groups ANOVAs were conducted on each of these outcome scores—quiz average, summative laboratory report, and final exam—to determine whether there were differences in learning between the online and F2F groups. In addition, the similarity score automatically provided by Turnitin was analysed to assess the degree to which online students vs. F2F students relied on plagiarism to complete their assignments. These data are presented in Table 1:

Table 1 Students’ results

The lab report and the final exam marks shown in Table 1 are the most important since they are essential to pass the class; students must score at least 50% to pass. For quizzes, F2F students had an average score of 83.52 (SD = 11.26), while Remote students averaged 80.18 (SD = 11.49); the difference was not statistically significant (p > 0.05). For the lab report, F2F students averaged 61.11 (SD = 16.50) and Remote students 64.67 (SD = 10.44); again, the difference was not statistically significant (p > 0.05). The final exam showed similar results, with F2F students averaging 79.11 (SD = 10.55) and Remote students 74.61 (SD = 13.19), with no significant difference (p > 0.05). A notable finding was the difference in Turnitin scores, with F2F students averaging 26.72 (SD = 8.34) and Remote students 40.67 (SD = 8.36), a statistically significant difference (p < 0.05). The Turnitin score calculated for the lab report thus shows a significant difference between the two groups: students attending virtually appear to have a higher level of plagiarism than the F2F students. Both groups were also subject to several quizzes at the end of each section of the teaching module; although the F2F group scored slightly higher on average, there was no significant difference between the two groups.
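As a sanity check, the significance pattern reported above can be reproduced from the summary statistics alone, using a pooled two-sample t-test on the reported means and SDs; this sketch assumes the group sizes of 18 F2F and 12 Remote students stated in the Methods:

```python
from scipy.stats import ttest_ind_from_stats

n_f2f, n_remote = 18, 12  # group sizes from the Methods section

# (mean_f2f, sd_f2f, mean_remote, sd_remote) as reported in Table 1.
assessments = {
    "quizzes":    (83.52, 11.26, 80.18, 11.49),
    "lab report": (61.11, 16.50, 64.67, 10.44),
    "final exam": (79.11, 10.55, 74.61, 13.19),
    "Turnitin":   (26.72,  8.34, 40.67,  8.36),
}

for name, (m1, s1, m2, s2) in assessments.items():
    # Pooled-variance t-test computed directly from summary statistics.
    t, p = ttest_ind_from_stats(m1, s1, n_f2f, m2, s2, n_remote)
    flag = "significant" if p < 0.05 else "not significant"
    print(f"{name}: t={t:.2f}, p={p:.3f} ({flag})")
```

Run as-is, the quizzes, lab report, and final exam comparisons come out non-significant while the Turnitin comparison is significant, matching the reported pattern.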

4.1 Correlation analysis

The correlation coefficient ranges from − 1 to 1, with values closer to − 1 indicating a strong negative correlation, values closer to 1 indicating a strong positive correlation, and values close to 0 indicating no correlation. From the correlation matrix, Lab Mark has a strong positive correlation with Exam Score (r = 0.862), suggesting that students who perform well in lab activities also tend to score well in the final exam; this implies consistency in performance across different types of assessment. Lab Mark has a weak negative correlation with Turnitin score (r = − 0.199), indicating that higher lab scores are slightly associated with lower similarity scores; students who engage more authentically with their lab work may be less likely to plagiarize. Finally, Turnitin score has a weak negative correlation with Exam Score (r = − 0.239), implying that higher instances of plagiarism are not associated with higher exam scores and suggesting that plagiarism does not benefit overall student performance.
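Pearson correlations of the kind reported above can be computed as follows; this is a minimal sketch in which the score vectors are synthetic placeholders (deliberately constructed so that lab and exam scores covary), not the study data:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 30  # cohort size, matching the class described in the Methods

# Synthetic placeholder scores (NOT the study data).
lab_mark = rng.normal(63, 14, size=n)
exam_score = 0.9 * lab_mark + rng.normal(0, 6, size=n)  # correlated by design
turnitin = rng.normal(32, 10, size=n)                   # independent by design

# pearsonr returns the correlation coefficient and its two-sided p-value.
r_lab_exam, p1 = pearsonr(lab_mark, exam_score)
r_lab_turn, p2 = pearsonr(lab_mark, turnitin)

print(f"Lab vs Exam:     r={r_lab_exam:.3f} (p={p1:.3f})")
print(f"Lab vs Turnitin: r={r_lab_turn:.3f} (p={p2:.3f})")
```

With the synthetic construction above, the lab–exam correlation comes out strongly positive while the lab–Turnitin correlation is near zero, illustrating how the reported matrix is read.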

4.2 Levene’s test results

For the quizzes, the F2F group had a mean score of 83.52 and the Remote group 80.18; for the final exam, the F2F group had a mean score of 79.11 and the Remote group 74.61. These results suggest that scores were generally higher for students taught face-to-face than for those taught remotely (Table 2).

Table 2 Comparative analysis of student outcomes in online and face-to-face formats (Levene’s test results)

Levene’s test for equality of variances yielded a statistic of 0.397 with a p-value of 0.552. The small test statistic indicates only a small difference in variances between the two groups, suggesting that the assumption of equal variances may hold. As the p-value is greater than 0.05, we fail to reject the null hypothesis of equal variances between the F2F and Remote groups. The assumption of equal variances therefore holds, validating the use of parametric tests for further statistical analysis.

5 Discussion

The statistical analysis indicates that there were no significant differences between the F2F and Remote students’ performance on the quizzes, the final exam and the final laboratory report, with the exception of the plagiarism scores. This outcome suggests that remote learning can be as effective as traditional classroom settings in terms of student academic performance. However, the higher plagiarism scores in the Remote group suggest that academic integrity is a concern that needs to be addressed more rigorously in remote learning environments. The data suggest that there is no statistically significant difference between face-to-face and virtual delivery methods in terms of final marks in the Food and Microbes course, indicating that virtual teaching could be employed as an alternative or even as the main approach to teaching. However, further research is needed to investigate the effect of virtual teaching on students' well-being and other factors that may affect learning outcomes.

These findings are similar to those of [36], which reported no significant difference between performance in a traditional and a simulated food chemistry lab. However, their study focused only on laboratory classes (a traditional hands-on lab and a simulated lab), whereas our study evaluated formal teaching, quizzes and laboratory practical classes. The only significant difference found between our two groups is in the Turnitin score, which implies that Remote students copied more. Overall, both groups obtained higher marks on quizzes and the final exam than on the final lab report. This provides information about the impact of the examination modality. While quizzes and final exams are characterised by multiple-choice questions, the lab report must be written from the ground up. Students are provided with a standard template and general information about how to structure the report and what key information needs to be included. The lower score on lab reports obtained by both groups suggests that this examination modality can be more difficult for students. This could be because students are more experienced in learning concepts and providing answers to specific questions than in structuring a scientific report. The latter cannot be drafted simply by knowing the scientific concepts learned during the course; to write a report it is also necessary to analyse data, organise them logically and comprehensively, and write all the sections ab initio. The lack of significant differences in academic performance between the F2F and Remote groups supports the viability of remote learning as a comparable alternative to traditional classroom settings. The significant difference in Turnitin scores points to the need for enhanced strategies to promote academic integrity in remote learning environments. Institutions may need to implement more rigorous checks and balances or provide more education on academic ethics.
The strong correlation between lab and exam performance underscores the importance of hands-on activities, even in a virtual environment. Educators should strive to integrate practical, application-based tasks into the curriculum to improve learning outcomes. The slight negative correlation between lab performance and plagiarism indicates that students who are more engaged with coursework may be less inclined to plagiarize. This suggests a need for personalized learning paths and support to enhance engagement and reduce academic dishonesty.

The Turnitin score associated with the lab report is a plagiarism indicator: higher scores correspond to a higher percentage of matching text. In our analysis of Turnitin scores, we recognize that these metrics represent the degree of text similarity rather than direct evidence of plagiarism. Turnitin’s functionality as a similarity-checking tool does not definitively distinguish between instances of properly cited work and plagiarized content. Therefore, while the higher Turnitin scores observed in remote students suggest increased similarity, this should not be automatically equated with academic dishonesty. It is essential to consider the context of each similarity instance identified by Turnitin to make informed judgments about academic integrity. It is interesting to note that although students are not supervised while writing the lab reports, there is a statistically significant difference between F2F and Remote students. This might be indirectly related to the greater distance between students and the institution. Being far away from the university could affect how committed students are to studying, increasing the tendency to copy rather than write lab reports on their own. We can speculate that this could be due to a weaker personal relationship with the lecturer: F2F students have direct and personal interaction with the lecturer, whom they might not want to disappoint with a plagiarised report, whereas Remote students might perceive the lecturer merely as a virtual tutor and therefore feel no need to form any kind of social connection. However, the differences in Turnitin score could also be unrelated to the remote study approach and be due to other factors not measurable in this study. For example, students' nationality could have played a significant role in this context. As discussed earlier, Ison’s study showed that students in China have a different view both of plagiarism itself and of which acts could be considered plagiarism [15].
The entire online group consisted of Chinese students, so the reasons discussed in Ison’s paper for plagiarism may apply to this cohort. In addition, other literature notes that Turnitin, whilst useful, tracks similarity rather than plagiarism. Laboratory reports, as noted by the Gallant et al. study, will have similarities due to the fact that certain sections of a laboratory report are essentially the same for all students (e.g., the method) [46]. The influence of cultural and educational backgrounds on Turnitin scores warrants consideration, as highlighted by studies like [15]. Cultural perspectives towards academic writing and citation practices, particularly evident in diverse student populations, can significantly impact Turnitin similarity scores. For instance, in cultures where collective knowledge is valued, there may be a different approach to citing sources and conceptualizing plagiarism. Educational systems also play a role, with variations in the emphasis on originality versus collective learning affecting students' writing practices. Therefore, observed differences in Turnitin scores between face-to-face (F2F) and remote students may stem from deeper cultural and educational influences rather than just physical distance. Instructors should recognize and accommodate these diverse perspectives, providing tailored support to foster academic integrity while respecting cultural differences in writing and citation norms. It is important to highlight that similarity scores are not definitive evidence of plagiarism on their own. Instructors need to review the highlighted similarities in context to determine whether they represent legitimate citations, quotations, or instances of improper copying. Additionally, some types of assignments, such as research papers, may naturally have higher similarity scores due to the inclusion of properly cited external sources.
The teaching and learning methods described so far are not identical; however, our results show that they can be considered equivalent. Excluding the difference found in plagiarism scores, both groups performed equivalently, demonstrating that it is possible to successfully teach and learn a scientific laboratory course remotely. Our study is limited to a specific subject and a small number of students, so it is necessary to study the effect of remote laboratory learning on a larger group and confirm our findings. Moreover, it would be interesting to compare our study with similar approaches in other subjects, spanning the human sciences (e.g., history, philosophy, sociology, psychology) and more analytical subjects (e.g., mathematics, physics, chemistry).
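A caveat worth making explicit: a non-significant difference is not, by itself, proof of equivalence. The excerpt does not state that a formal equivalence test was run, but one standard way to support such a claim is the two one-sided tests (TOST) procedure, sketched below with a normal approximation. The marks are synthetic and purely illustrative, and the equivalence margin is an assumption the analyst must justify for their own setting.

```python
import math
import statistics

def tost_p(a, b, margin):
    """Two One-Sided Tests (TOST) for equivalence of means, using a
    normal approximation to the test statistics. Returns the larger of
    the two one-sided p-values; equivalence is supported when it falls
    below the chosen alpha. Sketch only: a full analysis would use
    t-distributions (e.g. with Welch's degrees of freedom)."""
    diff = statistics.mean(a) - statistics.mean(b)
    se = math.sqrt(statistics.variance(a) / len(a)
                   + statistics.variance(b) / len(b))
    phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))  # standard normal CDF
    p_lower = 1 - phi((diff + margin) / se)  # H0: diff <= -margin
    p_upper = phi((diff - margin) / se)      # H0: diff >= +margin
    return max(p_lower, p_upper)

# Hypothetical final marks (invented for illustration), 5-point margin
f2f_marks = [65, 70, 68, 72, 66, 71]
remote_marks = [67, 69, 70, 68, 66, 73]
p = tost_p(f2f_marks, remote_marks, margin=5)
print(f"TOST p = {p:.4f}")  # equivalence supported if p < alpha
```

The key design choice is the margin: it encodes how large a difference in marks would still be considered educationally negligible, and it must be fixed before looking at the data.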

Remote teaching is not completely new. Even before the pandemic, many universities offered fully online courses, but the pandemic forced schools and universities to accelerate their plans and to improve the methodologies and technologies of remote teaching. This approach has already shown numerous advantages over traditional teaching: a lecture can be delivered to a larger number of students without the need for a larger classroom; students can learn from the most talented and distinguished professors, and study alongside peers who are far away, without travelling. Moreover, students gain flexibility from being able to re-watch recorded lectures and save on the cost of travelling to lectures and labs, while the university reduces the energy costs of its infrastructure. One of the key challenges encountered during the remote delivery of this course was maintaining student engagement and active participation in the online learning environment. As highlighted by Rajabalee et al. [47], student engagement is crucial for academic performance in e-learning settings. To mitigate this challenge, we employed several strategies drawn from the literature and our own experience. Firstly, we intentionally designed interactive learning activities that fostered active student contribution, as emphasized by Rajabalee et al. [48]; these included online discussions, collaborative projects, and opportunities for peer feedback, which have been shown to positively affect student engagement and overall performance in online courses. Secondly, we focused on enhancing learner satisfaction by providing clear communication, prompt feedback, and easily accessible support resources, since Rajabalee and Santally [49] underscore the importance of learner satisfaction in promoting engagement and performance in online modules.
We maintained regular communication through multiple channels, offered timely feedback on assignments and queries, and curated a comprehensive set of online resources for students to refer to at their convenience.

Despite these efforts, we acknowledge that the remote learning experience may have presented additional challenges, such as technical difficulties, feelings of isolation, or distractions in home environments. Continuous monitoring, adaptation, and open communication with students were crucial in identifying and addressing these challenges as they arose. By implementing strategies to foster engagement, active contribution, learner satisfaction, and open communication, we aimed to create an effective and supportive online learning environment.

6 Conclusion

According to our study, there is no statistically significant difference between F2F and Remote students' final marks at the end of the Food and Microbes course reported here. This indicates that virtual teaching could be employed as an alternative to, or even as the main approach for, teaching. This case study provided important information about the effectiveness of remote teaching in a postgraduate laboratory course. Although the Turnitin scores of the two groups differ significantly, the score alone may not give a definitive picture of the two groups' approaches to plagiarism, as the Turnitin tool detects only textual similarity. The limited number of students involved also constrains our conclusions. It is important to extend the study to a larger group of students, testing the effects of the same course and its two delivery methods on learning and student performance; this would help refine the results and provide further insights to improve the remote teaching strategies currently in place.

The coronavirus pandemic strongly impacted education at every level. It also provided the chance to reshape teaching approaches, particularly through remote teaching. Although remote teaching was already known and widely employed by universities and the private sector, the pandemic forced the education sector to rapidly improve and deploy the available technologies to guarantee the best possible teaching quality. However, to assess whether remote teaching can be considered a viable and robust alternative to F2F, its effects on students' performance and well-being still require further investigation.

7 Future research

While the current study was limited to quantitative analysis due to pandemic-related constraints, future research will aim to incorporate qualitative methods such as interviews, focus groups, and possibly case studies. This will allow a more holistic evaluation of the educational approaches by capturing the intricacies of student experiences and perceptions that are not readily quantifiable. We believe that integrating quantitative and qualitative data will provide a more comprehensive understanding of the effectiveness of different teaching modalities.

Furthermore, future research on diversifying assessment techniques holds the promise of enriching our understanding of student learning and engagement. By incorporating a broader spectrum of assessment methods, including formative assessments, project-based assessments, and peer evaluations, educators can gain deeper insights into the multifaceted nature of student progress. Formative assessments offer real-time feedback, allowing for adjustments in teaching strategies and student learning approaches. Project-based assessments encourage practical application of knowledge, fostering critical thinking and problem-solving skills. Peer evaluations promote collaborative learning and self-reflection, essential components of a comprehensive educational experience. Such research could explore the impact of these diverse assessment methods on student motivation, retention of knowledge, and overall academic success.

In the future, we would like to explore additional factors influencing student performance in online and face-to-face settings, such as access to resources, the level of support provided, and the nature of assessments; this would provide valuable insights into optimizing remote teaching strategies. Conducting longitudinal studies to assess the long-term effects of remote learning on student outcomes and well-being would also contribute significantly to understanding the sustainability and efficacy of remote education. Future studies could likewise expand the scope to multiple cohorts or institutions, increasing the sample size and enhancing the generalizability of the findings. Finally, we aim to explore avenues for responsibly sharing data while protecting participant privacy and adhering to ethical guidelines, which would enhance the reproducibility of our findings and foster collaborative efforts within the research community.

Data availability

The data that support the findings of this study are not openly available due to reasons of sensitivity and are available from the corresponding author upon reasonable request.

References

  1. Bilecen B. Commentary: COVID19 pandemic and higher education: International mobility and students’ social protection. Int Migr. 2020;58(4):263–6.
  2. Eder R. The remoteness of remote learning: a policy lesson from COVID19. J Interdiscip Stud Educ. 2020;9(1):168–71.
  3. Kedraka K, Kaltsidis C. Effects of the COVID-19 pandemic on university pedagogy: Students' experiences and considerations. Eur J Educ Stud. 2020;7(8).
  4. Onyema EM, Eucheria NC, Obafemi FA, Sen S, Atonye FG, Sharma A, Alsayed AO. Impact of coronavirus pandemic on education. J Educ Pract. 2020;11(13):108–21.
  5. Martin F, Wang C, Jokiaho A, May B, Grübmeyer S. Examining faculty readiness to teach online: a comparison of US and German educators. Eur J Open Dist E-learn. 2019;22(1):53–69.
  6. Karalis T, Raikou N. Teaching at the times of COVID-19: inferences and implications for higher education pedagogy. Int J Acad Res Bus Soc Sci. 2020;10(5):479–93.
  7. Ilieva G, Yankova T, Klisarova-Belcheva S, Ivanova S. Effects of COVID-19 pandemic on university students’ learning. Information. 2021;12(4):163.
  8. Basilaia G, Dgebuadze M, Kantaria M, Chokhonelidze G. Replacing the classic learning form at universities as an immediate response to the COVID-19 virus infection in Georgia. Int J Res Appl Sci Eng Technol. 2020;8(3):101–8.
  9. Saha SM, Pranty SA, Rana MJ, Islam MJ, Hossain ME. Teaching during a pandemic: do university teachers prefer online teaching? Heliyon. 2022;8(1):e08663.
  10. Scheckler RK. Virtual labs: a substitute for traditional labs? Int J Dev Biol. 2003;47(2–3):231–6.
  11. Lewis DI. The pedagogical benefits and pitfalls of virtual tools for teaching and learning laboratory practices in the biological sciences. The Higher Education Academy: STEM. 2014.
  12. Ayega D, Khan A. Students experience on the efficacy of virtual labs in online biology. In: 2020 The 4th International Conference on Education and E-Learning. 2020.
  13. Faulconer E, Griffith J, Wood BL, Acharyya S, Roberts D. A comparison of online and traditional chemistry lecture and lab. Chem Educ Res Pract. 2018;19(1):392–7.
  14. Ison DC. Does the online environment promote plagiarism? A comparative study of dissertations from brick-and-mortar versus online institutions. J Online Learn Teach. 2014;10(2):272.
  15. Ison DC. An empirical analysis of differences in plagiarism among world cultures. J High Educ Policy Manag. 2018;40(4):291–304. https://doi.org/10.1080/1360080X.2018.1479949.
  16. Aljuhani K, Sonbul M, Althabiti M, Meccawy M. Creating a Virtual Science Lab (VSL): the adoption of virtual labs in Saudi schools. Smart Learn Environ. 2018;5(1):1–13.
  17. Darrah M, Humbert R, Finstein J, Simon M, Hopkins J. Are virtual labs as effective as hands-on labs for undergraduate physics? A comparative study at two major universities. J Sci Educ Technol. 2014;23(6):803–14.
  18. Hurtado-Bermúdez S, Romero-Abrio A. The effects of combining virtual laboratory and advanced technology research laboratory on university students’ conceptual understanding of electron microscopy. Interact Learn Environ. 2020;31:1126–41.
  19. Swan B, Coulombe-Quach X-L, Huang A, Godek J, Becker D, Zhou Y. Meeting the needs of gifted and talented students: case study of a virtual learning lab in a rural middle school. J Adv Acad. 2015;26(4):294–319.
  20. Sypsas A, Paxinou E, Kalles D. Reviewing inquiry-based learning approaches in virtual laboratory environment for science education. Διεθνές Συνέδριο για την Ανοικτή & εξ Αποστάσεως Εκπαίδευση. 2020;10(2Α):74–89.
  21. Wong W-K, Chen K-P, Chang H-M. A comparison of a virtual lab and a microcomputer-based lab for scientific modelling by college students. J Balt Sci Educ. 2020;19(1):157–73.
  22. Bassindale T, LeSuer R, Smith D. Perceptions of a program approach to virtual laboratory provision for analytical and bioanalytical sciences. J Forensic Sci Educ. 2021;3(1).
  23. Sergeevna VY, Lilia SS, Nikolaevna VS, Suvonovich EY. Main trends in the organization of the postgraduate research and education process in the context of digitalization of higher education. In: 2022 8th International Conference on Energy Efficiency and Agricultural Engineering (EE&AE). IEEE; 2022. p. 1–5.
  24. Paxinou E, Georgiou M, Kakkos V, Kalles D, Galani L. Achieving educational goals in microscopy education by adopting virtual reality labs on top of face-to-face tutorials. Res Sci Technol Educ. 2022;40(3):320–39.
  25. Chen X, Song G, Zhang Y. Virtual and remote laboratory development: a review. In: Earth and Space 2010: Engineering, Science, Construction, and Operations in Challenging Environments. 2010. p. 3843–52.
  26. Gilman SL. Do online labs work? An assessment of an online lab on cell division. Am Biol Teach. 2006;68(9).
  27. Ghazali AR, Zainodin EL, Madhavan I, Gnanasundram LS, Nisar N, Abd Rashid R, Tang WW. Perception of online teaching and learning (T&L) activities among postgraduate students in Faculty of Health Sciences, Universiti Kebangsaan Malaysia (UKM). Front Educ. 2022; p. 166.
  28. Keeney-Kennicutt W, Winkelmann K. What can students learn from virtual labs? Committee on Computers in Chemical Education. 2013.
  29. Lynch T, Ghergulescu I. Review of virtual labs as the emerging technologies for teaching STEM subjects. In: INTED2017 Proc. 11th Int. Technol. Educ. Dev. Conf., 6–8 March, Valencia, Spain. 2017.
  30. Perez S, Massey-Allard J, Butler D, Ives J, Bonn D, Yee N, Roll I. Identifying productive inquiry in virtual labs using sequence mining. In: International conference on artificial intelligence in education. Cham: Springer; 2017. p. 287–98.
  31. Jennifer GA, Thomas MG, Vijay Solomon R. Does virtual titration experiment meet students’ expectation? Inside out from Indian context. J Chem Educ. 2022;99(3):1280–6.
  32. Hofstein A, Mamlok-Naaman R. The laboratory in science education: the state of the art. Chem Educ Res Pract. 2007;8(2):105–7.
  33. Satterthwait D. Why are ‘hands-on’ science activities so effective for student learning? Teach Sci. 2010;56(2):7–10.
  34. Wang J, Guo D, Jou M. A study on the effects of model-based inquiry pedagogy on students’ inquiry skills in a virtual physics lab. Comput Hum Behav. 2015;49:658–69.
  35. Baladoh S, Elgamal AF, Abas HA. Virtual lab to develop achievement in electronic circuits for hearing-impaired students. Educ Inf Technol. 2017;22(5):2071–85.
  36. Crandall PG, O’Bryan CA, Killian SA, Beck DE, Jarvis N, Clausen E. A comparison of the degree of student satisfaction using a simulation or a traditional wet lab to teach physical properties of ice. J Food Sci Educ. 2015;14(1):24–9.
  37. Barrett TJ, Stull AT, Hsu TM, Hegarty M. Constrained interactivity for relating multiple representations in science: when virtual is better than real. Comput Educ. 2015;81:69–81.
  38. De La Torre L, Guinaldo M, Heradio R, Dormido S. The ball and beam system: a case study of virtual and remote lab enhancement with moodle. IEEE Trans Industr Inf. 2015;11(4):934–45.
  39. Merchant Z, Goetz ET, Cifuentes L, Keeney-Kennicutt W, Davis TJ. Effectiveness of virtual reality-based instruction on students’ learning outcomes in K-12 and higher education: a meta-analysis. Comput Educ. 2014;70:29–40.
  40. Borst CW, Lipari NG, Woodworth JW. Teacher-guided educational VR: Assessment of live and prerecorded teachers guiding virtual field trips. In: 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). 2018.
  41. Dillon E, Tucker B. Lessons for online learning: charter schools’ successes and mistakes have a lot to teach virtual educators. Educ Next. 2011;11(2):50–8.
  42. Stull J, Varnum SJ, Ducette J, Schiller J. The many faces of formative assessment. Int J Teach Learn Higher Educ. 2011;23(1):30–9.
  43. Shachar M, Neumann Y. Twenty years of research on the academic performance differences between traditional and distance learning: Summative meta-analysis and trend examination. MERLOT J Online Learn Teach. 2010;6(2).
  44. Brinson JR. Learning outcome achievement in non-traditional (virtual and remote) versus traditional (hands-on) laboratories: a review of the empirical research. Comput Educ. 2015;87:218–37.
  45. Sriadhi S, Sitompul H, Restu R, Khaerudin S, Wan Yahaya WA. Virtual-laboratory based learning to improve students’ basic engineering competencies based on their spatial abilities. Comput Appl Eng Educ. 2022;30(6):1857–71.
  46. Gallant TB, Picciotto M, Bozinovic G, Tour E. Plagiarism or not? Investigation of Turnitin®-detected similarity hits in biology laboratory reports. Biochem Mol Biol Educ. 2019;47:370–9. https://doi.org/10.1002/bmb.21236.
  47. Rajabalee YB, Santally MI, Rennie F. Modelling students’ performances in activity-based E-learning from a learning analytics perspective: implications and relevance for learning design. Int J Dist Educ Technol (IJDET). 2020;18(4):71–93.
  48. Rajabalee BY, Rennie F, Santally MI. The relationship between quality of student contribution in learning activities and their overall performances in an online course. Eur J Open Dist E-learn. 2018;21(1):16–30.
  49. Rajabalee YB, Santally MI. Learner satisfaction, engagement and performances in an online module: implications for institutional e-learning policy. Educ Inf Technol. 2021;26(3):2623–56.

Acknowledgements

We would like to acknowledge the University of Birmingham for their support and resources that contributed to this research.