Online Exams in Language, Linguistics and Translation Courses During the Pandemic in Saudi Arabia

| ABSTRACT At Saudi universities, there was a sudden shift from face-to-face instruction to distance learning and assessment in March 2020 due to the COVID-19 Pandemic. This study explored the status of online exams in language, linguistics, and translation courses in the first two semesters of the Pandemic (Spring 2020 and Fall 2020). Analysis of faculty surveys and students’ comments on Twitter showed that the main concern of 91% of the students was final exams and passing courses with high grades. Students were worried about the negative effect of online exams on their GPA. Since the students were not familiar with online exams taken via Blackboard, they were anxious and wondered if they would do well. Some cheated on online exams as their cameras were turned off. Numerous adjustments were mandated by university administrations to alleviate students’ anxiety, such as allocating 20% of the course mark to the final exam, allowing more exam time, and giving projects, open-book exams, term papers, reports, assignments, or a presentation instead of the final. Some instructors gave easy questions and were lenient in grading to avoid students’ complaints. They gave no essay questions, just objective questions. The students were given the option to drop the course, to choose a letter grade or a pass/fail result (i.e., no grade), or to have the course mark included in their GPA. Based on faculty surveys, this study reports the challenges of online exams during the Pandemic, the design and delivery of online exams, assessment forms and choices, course result options given to the students, grade inflation issues, and lessons learned.

Instructors lacked training in online assessment techniques (Montenegro-Rueda, Luque-de la Rosa, Sarasola Sánchez-Serrano & Fernández-Cerero, 2021). Arabic school trainee teachers in Malaysia lacked skills in using technology, content, and pedagogy (Sahrir, Hamid, Zaini, Hamat & Ismail, 2022). Kilickaya (2021) found five major challenges in teacher training programs during and after the COVID-19 Pandemic related to assessment type, assessment item formats, support, previous training (assessment literacy), and academic integrity. In South Africa, higher education instructors needed adequate professional development on setting and administering online assessments that test lower- and higher-order cognitive skills, sufficiently assess students' knowledge, and ensure the reliability of the assessment outcomes (Mafugu, 2021).
Another line of research described best online assessment practices. Senel and Senel (2021) found assignments to be the most widely used tools, and formative assessment and feedback to be important in remote assessment. Students who interacted with their instructors and those who took online tests were found to be more satisfied with the assessment quality. Halaweh (2021) found project-based learning assessment to be an effective alternative to online exams. Similarly, EFL teachers in Algeria used project-based language essays and summative assessment (Hichour, 2022). In Italy, Freddi (2021) employed book projects as an opportunity to reflect on the adjustments made to various planning and design factors informing language education during the Pandemic; those can be generalized to language teaching, learning, and assessment in the global digital world. Gupta, Jankie, Pancholi, Talukdar, Sahu and Sa (2020) recommended the use of open-ended short answer questions, problem-based questions, oral exams, and recorded objective structured clinical exams to assess students' knowledge and competence.
In South Africa, findings of a study by Aina and Ogegbo (2021) indicated that instructors used a combination of platforms such as Blackboard Collaborate, Google Classroom, WhatsApp, and Kahoot, with a wide variety of teaching and assessment techniques utilized on these platforms, such as small group work, collaborative learning, discussion posts, multiple choice quizzes, open-ended questions, essays, case methods, game activities, and chats. Those teaching and learning strategies helped increase interaction among the students, enabled immediate grading of scripts, and provided feedback to the students. Koris and Pál (2021) applied formative assessment as an alternative to traditional summative assessment, using tasks such as online presentations, online learning journals, blogs, creative writing, e-portfolios, and open-book exams, which present good opportunities for the students to be part of the assessment process.
In the UAE, Alshamsi, Zahavich and El-Farra (2021) emphasized the role of graded assessment in supporting institutional accountability and transferability of student achievement, student efficacy and informed pedagogical alterations. They reported that the Higher Colleges of Technology were able to deploy an off-campus student assessment model that builds upon assessment development and deployment, technology infrastructure, and governance resilience to support student learning.
Mäkipää and Luodonpää-Manni (2021) recommended that teachers in Finland pay more attention to enhancing feedback practices and connecting with students during the remote teaching period.
In China, Zou, Kong and Lee (2021) identified three types of teacher engagement with online formative assessment in EFL writing during the COVID-19 Pandemic: integral, auxiliary and disturbing, integrated with varied social, emotional, cognitive and physical investments in the formative use of ICT in writing assessment.
A third group of studies focused on the effectiveness of some types of technologies used for online assessment. For example, Darmawan, Daeni and Listiaji (2020) revealed that students were very responsive to Quizizz as an online assessment application during midterms; although the mean score was 43.08%, the students accepted the results. Students' responses to a questionnaire about the use of Quizizz tended to be positive, as 50% of the students gave positive responses to 8 statements. Quizizz presented the questions with ease and analyzed the answers, helping the teachers carry out the assessment.
Findings of a study by Cheriguene, Kabache, Kerrache, Calafate and Cano (2022) showed a very high satisfaction rate (more than 90%) with the NOTA (Novel Online Teaching and Assessment) scheme using Blockchain technology to maintain the expected teaching quality and assessment fairness while respecting the course and examination schedule. NOTA also motivated both the students and teachers to carry on their efforts, even from home, through the Blockchain incentive strategies.
Kahoot, a learning platform and a type of formative assessment tool, was applied in the context of pre-university education by Toma, Diaconu and Popescu (2021). The use of Kahoot had a significant and direct positive impact on the educational process during the COVID-19 Pandemic.
In Australia, Lloyd, Sealey and Logan (2021) used the Academic Safety Net (an academic student support package) to balance the COVID-19 disruption and undergraduate learning and assessment. Bayesian hierarchical models were applied to compare students' performance in 2019 and 2020. The results showed that the enrolment component of the Academic Safety Net was an effective equity measure that offered students an extended opportunity to self-withdraw in response to the general impacts of the Pandemic, while the results component protected the integrity of the results awarded during the Pandemic.
In Saudi Arabia, there was a sudden shift from face-to-face instruction to distance learning and assessment starting March 2020 due to the COVID-19 Pandemic. In Spring 2020 and Fall 2020, both instruction and exams were held online. In Spring 2021, instruction was held online, but exams were held face to face. Starting Fall 2021, both instruction and exams were held face to face.
In Spring 2020 and Fall 2020, results of surveys by Al-Jarf (2020a) and Al-Jarf (2021f) showed that 55% of the university students and instructors were dissatisfied with distance learning (DL) and online communication and had difficulty understanding online lectures. Results also showed an absence of goals, low student motivation and engagement, low self-efficacy, and a passive role in online classes. Many students were not interested in online learning or in doing assignments, did not participate in online class discussions, refused to give oral presentations, and preferred to use lecture recordings rather than attend online classes. The students' major concern was final exams and passing their courses with high grades. Many instructors and students were not technically prepared for this abrupt transition, as some did not have devices and Internet access.
Moreover, the literature review showed a lack of studies in Saudi Arabia that explored the status of online assessment during the Pandemic at Saudi higher education institutions and its effect on learning and students' morale during the first two semesters of the Pandemic. Therefore, this study aims to explore the status of online exams in language, linguistics and translation courses during the first two semesters of the COVID-19 Pandemic (Spring 2020 and Fall 2020) at a sample of higher education institutions in Saudi Arabia, and to report the challenges of online exams, the design and delivery of online exams, assessment forms and choices, course result options given to the students, grade inflation issues, and lessons learned. Specifically, the current study aims to answer the following questions: (i) How did the students take their monthly and final exams in the first and second semesters of the Pandemic? (ii) Which platform was used for the online exams? (iii) Did instructors reduce the number of questions and change the exam quality? (iv) How did instructors prepare the questions and how did the students answer? (v) What problems did instructors and students face in online remote exams? (vi) How were the problems solved/treated? (vii) Has the distribution of grades changed?
The study sheds light on the common online assessment approaches used during the Pandemic, the challenges faced, the adaptations made to the virtual delivery and administration of online assessments, and the effects of online assessment on students' learning. It also sheds light on the types of online support needed by students and instructors.

Subjects
A total of 95 students and 25 instructors from colleges and departments of languages and translation at 5 Saudi universities (King Saud University (KSU), Princess Noura University (PNU), Imam Abdulrahman bin Faisal University (IABFU), King Abdul-Aziz University (KAAU) and King Khalid University (KKU)) participated in the study. All the participants are native speakers of Arabic with English as their second/foreign language. Participating instructors have an M.A. and/or a Ph.D. degree and teach different language, linguistics, translation, and interpreting courses. The participants were all females. Student participants were all undergraduates (freshman, sophomore, junior, and senior). They had different proficiency levels in English and different background knowledge in linguistics, translation, and interpreting.

Data collection
To explore students' and instructors' experience with online exams at Saudi universities during the first two semesters (Spring and Fall 2020) of the COVID-19 Pandemic, a questionnaire-survey with open-ended questions about online exams was developed and distributed via WhatsApp. The survey asked instructors the following questions: (i) How did your students take their monthly and final exams in the first and second semesters of the COVID-19 Pandemic? (ii) Which platform did you use for online exams? (iii) Did you reduce the number of questions and change the quality of the questions? (iv) What kind of questions did you ask (multiple choice, matching, essay, etc.)? How many questions did you give? (v) How did you prepare the questions and how did the students answer? (vi) What problems did you and your students face in online remote exams? (vii) How did you solve/manage the problems? (viii) Has the distribution of course marks changed? (ix) What is your view of the results of online exams?
Secondly, a sample of 150 tweets about students' complaints and comments on final exams during the first semester of the COVID-19 Pandemic was collected from Twitter.
Thirdly, final grades of a sample of courses taught online in Spring 2020 were obtained from the instructors in the sample.
Fourthly, university mandates about final exams in the Spring semester 2020 were obtained from university pages on Twitter and from local newspapers.

Data Analysis
Instructors' responses to the questionnaire-survey were categorized according to the challenges of online assessment, the design and delivery of online exams, assessment forms and choices, and the course result options given to the students. Responses are reported quantitatively and qualitatively.
Students' tweets were analyzed to find out their views of online exams, the kinds of problems that they had with online exams and their preferences of end of course assessments.
The grades obtained for the sample of courses were compared with language, linguistics and translation course grades reported in other studies by Al-Jarf to find out if any grade inflation exists due to the Pandemic.
For reliability and validity purposes, a sample of students' and teachers' responses to the questionnaire-survey was analyzed, categorized and quantified by a colleague with a Ph.D. degree, using the same categories and their definitions (challenges of online assessment, design and delivery of online exams, assessment forms and choices, and course result options). Both analyses were compared, and variations were resolved by discussion. The percentage of agreement between the two analysts was 96%.
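An agreement percentage like the 96% reported above can be computed mechanically once both analysts' codings are tabulated. The sketch below is illustrative only: the category labels mirror the coding scheme described above, but the coded responses themselves are invented, not the study's data.

```python
# Hypothetical sketch of inter-rater percent agreement. The four category
# labels follow the coding scheme above; the responses are invented.

def percent_agreement(coder_a, coder_b):
    """Share (in %) of items both coders placed in the same category."""
    assert len(coder_a) == len(coder_b), "coders must rate the same items"
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100 * matches / len(coder_a)

# 25 invented responses; the two analysts disagree on one item only.
analyst_1 = (["challenges"] * 10 + ["design_delivery"] * 6
             + ["assessment_forms"] * 5 + ["result_options"] * 4)
analyst_2 = analyst_1[:-1] + ["assessment_forms"]

print(percent_agreement(analyst_1, analyst_2))  # → 96.0
```

Disagreements flagged this way (the one mismatched item here) are exactly the cases the two analysts would then resolve by discussion.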

Status of Online Assessment in Spring 2020
Findings of the current study showed that in Spring 2020 (the first semester of the Pandemic), 55% of the students and instructors in the sample were dissatisfied with distance learning, found it ineffective and frustrating, and preferred face-to-face instruction. No marks were allocated to online class attendance, as is usually the case in face-to-face classes. The main concern of 91% of the students was final exams, passing courses with high grades, and the negative effect that online exams might have on their GPA. They complained about grading and marks on social media. Some requested automatic passing, waiving of final exams, postponement of final exams, being given projects, reports or research papers instead of written/oral exams, or duplicating the marks of the first half of the semester. Since the students were not familiar with online exams taken via the Blackboard platform, they were anxious and wondered if they would do well and obtain good course grades. In their tweets, students declared that online exams were a burden, that they were under pressure, and that they could not cope with the requirements of online exams, especially as final exams for the Spring 2020 semester were scheduled in the holy month of Ramadan.
Most university instructors used Blackboard for exams. 69% of the instructors thought online exams were ineffective. Some students cheated on exams. Some instructors had doubts about whether some students took the tests themselves or someone else took the tests on their behalf, as the students' video cameras were off during the exam. Some students overslept, logged into the platform late, and started their online exams late, so instructors had to give them extra time; in effect, the exam was given several times rather than once.
Both students and instructors had technical problems with Blackboard during online exams and frequent disconnections which required re-logging in. Both students and instructors had no prior experience with online exams, and most had no prior experience using Blackboard. The instructors stated that they were overwhelmed by online instruction and online exams and had no prior experience in preparing and conducting online exams, in online assessment techniques, in grading online exams and reporting their results, or in preventing cheating on online exams, as they had never received prior training in online instruction and online assessment.
The above findings are consistent with findings of other prior studies conducted in other countries such as Nguyen, Keuseman, and Humston (2020); Meccawy, Meccawy and Alsobhi (2021); Montenegro-Rueda, Luque-de la Rosa, Sarasola Sánchez-Serrano and Fernández-Cerero (2021) and Verhoef and Coetser (2021) that reported problems of cheating, dishonesty and misconduct due to lack of monitoring, lack of time management, feeling overwhelmed and stressed and struggling with technology. As in the present study, some students in Slack and Priestley's (2022) study reported that online assessment requires more effort compared to traditional techniques.
The lack of training in online instruction and online assessment reported by instructors and students in the current study is also consistent with the results of studies by Montenegro-Rueda, Luque-de la Rosa, Sarasola Sánchez-Serrano and Fernández-Cerero (2021) and Sahrir, Hamid, Zaini, Hamat and Ismail (2022). The challenges related to assessment types, assessment item formats, lack of support, lack of previous training in online assessment, and academic integrity reported by instructors in the present study are similar to those reported by Mafugu (2021) and Kilickaya (2021).

Adjustments to Online Exams Mandated by University Administrations (Spring & Fall 2020)
Due to students' complaints and anxiety about online exams, course grades and GPAs, university administrations in Saudi Arabia issued several directives and adjustments regarding online exams during the Pandemic. They mandated that only 20% of the course mark be allocated to final exams, instead of 50% during face-to-face instruction. They mandated that instructors allow more exam time, give easy and straightforward questions rather than higher-level thinking questions, and be lenient in grading. Instead of a written final exam, instructors could give projects, open-book exams, term papers, or assignments, or the students could give a presentation. Some instructors gave no essay questions, just objective questions (multiple choice, true/false, matching and gap filling). The students were given the option to drop the course, to choose a letter grade or a pass/fail result, to have the course mark included in their GPA, or to have it waived.
These results are partially consistent with findings of a study by Lloyd, Sealey and Logan (2021) in Australia, which revealed that more students withdrew from courses in 2020, while fewer students remained enrolled but failed.

Status of Online Assessment in Fall 2020
In the summer of 2020, instructors attended workshops on online instruction and assessment. Platform and internet connectivity problems were solved. For 45% of the instructors, online exams in Fall 2020 went smoothly with no problems, as in Example 1 below:

Example 1:
Instructor A had no problems with her Speaking tests. She prepared different sets of questions using Google Forms. The students recorded their presentations within a limited time; if they took longer, they would be logged off. The students sent their recordings to the instructor via email. She did not make the questions easier because she worked very hard in her teaching. The students did not cheat, as she did not notice any similarities in their answers. The marks that the students obtained were lower than in normal times. As mandated by her university, she allocated 20% of the course mark to the final exam and 80% to semester work.
On the other hand, 55% of the instructors gave a variety of tasks on their final exams in language and linguistics courses, faced some challenges, and described how they handled those challenges. The following are examples of assessment procedures in a variety of courses at different universities:

Example 2:
In a Speaking course at another university, Instructor B required the students to give a presentation using Blackboard. Grading presentations was a problem for her. Blackboard randomizes the questions and topics, and students cannot go back to change their answers to previous questions. She reduced the final exam time from 2 hours to 1 hour and allocated 5 minutes to each question/topic. When the exam started, she would log in to find out which students had started the exam and which had not. She indicated that some students did not start their exam right away; they would wait for their classmates to finish and, after 10 minutes, start answering the questions with their classmates on their mobiles. She would ask each student via WhatsApp why she had not started the exam. The students complained because other instructors did not reduce the final exam time as she did.
She added that in speaking, some students wrote the presentation on paper and read it out loud. She deducted marks for reading instead of talking. She forced the students to speak, and if they did not, she would mark them absent. She allocated 10 marks to class participation. Students complained, argued, denied, and were not happy with their marks. WhatsApp helped her a lot in communicating with the students.
She noted that her brother and his classmates had a cheating group.

Example 3:
In an online listening course exam at a third university, Instructor C played a recording, and the students sent her their answers via WhatsApp at the same time. When students take a listening exam on Blackboard, they have time to ask their classmates. To prevent cheating, she gave the students only 5 seconds to send her their answers; she did not accept answers after that and cancelled any received a few minutes late.

Example 4:
Instructor D prepared 150 questions for the Semantics and Pragmatics course final exams. Blackboard selected 50 questions and assigned different sets of questions to different students. She cancelled essay questions. She used higher-level thinking questions and no recall questions. The questions were not from the textbook. The students could not go back to change their answers. She allocated 1 minute to each question. In some cases, if the students did not like the questions, they would not answer the first 5 questions very well; they would close the page or claim that the system was slow or that they were having technical problems. Because too many users (students) were taking their exams on Blackboard at the same time, the students had frequent disconnections. Instructor D thought that online exams were easy (a kind of play) for the students. She pointed out that some instructors were lenient in grading and ignored grammatical and spelling mistakes, i.e., did not deduct marks for mistakes. Some students cheated even on writing exams; they used WhatsApp to exchange answers and obtained comparable grades.
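The pool-based design in Example 4, where each student receives a different random subset of 50 questions from a 150-item bank, can be mimicked with a short script. This is an illustrative sketch, not Blackboard's actual selection algorithm; the question IDs and student identifiers are hypothetical.

```python
import random

# Illustrative sketch of per-student random question selection, mimicking
# the pool-based design in Example 4 (50 questions drawn from a 150-item
# bank). NOT Blackboard's actual algorithm; IDs here are hypothetical.

def assign_exam(pool, n_questions, student_id):
    """Draw n_questions from the pool, seeded per student for reproducibility."""
    rng = random.Random(student_id)   # same student -> same draw on re-login
    return sorted(rng.sample(pool, n_questions))

pool = list(range(1, 151))            # question IDs 1..150
exams = {sid: assign_exam(pool, 50, sid) for sid in ("st01", "st02", "st03")}

# Every student receives 50 distinct questions, all taken from the pool.
assert all(len(set(qs)) == 50 for qs in exams.values())
```

Seeding the draw per student means a dropped connection and a re-login reproduce the same set, which matters given the frequent disconnections the instructors reported.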

Example 5
Instructors had the choice to give a written final exam, a take-home exam, a project, a report, or a paper for 20 marks (20% of the course mark). In her Applied Linguistics course, Instructor E asked the students to record an interview with a relative about a life experience on WhatsApp, prepare a video report, and send it to her via the Blackboard platform. She gave timed objective questions, allocating a quarter of a point to true/false questions, half a point to matching questions, and 2 points to multiple-choice questions. 15 marks were allocated to the midterm exam.
The students complained about the reduced exam time and about not being able to go back to previous questions. They also complained that the difficulty level of the exams did not match the difficulty level of the material, as some instructors gave difficult questions to minimize cheating. When Instructor E saw that the university was lenient, she changed her procedures: in the second semester of COVID-19, she gave 10 quizzes with 4 questions in each quiz and many listening assignments.

Example 6
At King Abdul-Aziz University, Instructor F gave the Writing exam via QuestionMark and the final exam via CBT. The Speaking exams were conducted like face-to-face exams but held virtually. She allocated 20 marks to the midterm, 30 marks to the writing final, 10 marks to the speaking final, and 40 marks to the CBT final. She gave daily assignments and progress quizzes (formative assessment) on Blackboard.
In the case of graduate students, assessment was flexible. Instructors could give open-book exams or ask the students to write a critical review, give a poster presentation, or give a PowerPoint presentation. The department mandated that at least 3 types of assessment be given. More students passed during COVID-19, and very few failed.

Grade Inflation in Spring 2020
Results of the course grades in Table 1 show that all the students passed their exams; no student received a D or an F. Table 1 also shows that 34% of the students in all 14 courses were awarded a grade of A+ or A; 12.6% got a B+ or B; and only 2% got a C+ or C. Interestingly, 58% of the students in the 14 courses chose to pass their courses with no grade. In 78.6% of the courses, between 14% and 100% of the students (median = 92%) chose a no-grade result for their courses.

The instructors in the sample reported that the students did not feel they had to work hard. They added that the university administration wanted all the students to pass to avoid students' grumbling and complaints. Some instructors felt obliged to be lenient and give easy questions, as the administration mandated that they make it easy for the students. In addition, the instructors reported that some students got an A without deserving it. After the Pandemic, students started to ask for an A and for passing with honors. Some students with a GPA of 4.5/5 cannot construct a sentence. Other students complained about their instructors when they did not get an A. Some instructors thought that the 20 marks the university allocated to the final exam were like a gift to the students to alleviate their anxiety and make them feel better.
Moreover, the instructors pointed out that grade inflation in online exams during the Pandemic was higher than face to face exams before the Pandemic. Many students passed with high grades (As and Bs), few Cs and no Ds and Fs. An instructor declared that 7 students in her course got an A during the Pandemic compared to one student before the Pandemic.
Although grade inflation is a global phenomenon, grade inflation in online language and linguistics exams during the Pandemic at the sample institutions in the current study is even higher than grade inflation in language and translation courses reported in studies by Al-Jarf (2022g), Al-Jarf (2021a), Al-Jarf (2002a), Al-Jarf (2002b), Al-Jarf (2001). In those studies of 70 language and translation courses, the author found that 44% of the students scored between 90-100 (Grade A+ and A), 27% scored between 80-89 (Grade B+ and B); 14% scored between 70-79 (C+ and C); 5.7% between 60-69 (D+ and D); 1.6% between 50-59 marks and less than 1% failed the course. The percentage of students who obtained grades A and B (41%) and the pass rate (99%) reflect obvious grade inflation. In Table 1, grades were clustered in the A+, A, B+ and B category.
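Grade-band percentages like those compared above can be computed mechanically once raw marks are available. The sketch below is illustrative: the 100-mark bands follow the ranges stated in the text, but the list of marks is invented, not the study's data.

```python
from collections import Counter

# Illustrative sketch: map raw course marks (out of 100) to the grade
# bands used in the comparison above and compute each band's percentage.
# The sample marks are invented, not the study's data.

BANDS = [(90, "A+/A"), (80, "B+/B"), (70, "C+/C"),
         (60, "D+/D"), (50, "50-59"), (0, "F")]

def band(mark):
    """Return the label of the first band whose cutoff the mark reaches."""
    return next(label for cutoff, label in BANDS if mark >= cutoff)

marks = [95, 92, 91, 88, 85, 83, 74, 66, 58, 45]   # invented sample
counts = Counter(band(m) for m in marks)
percentages = {g: 100 * n / len(marks) for g, n in counts.items()}
```

Running the same tabulation over the Pandemic-semester marks and over the pre-Pandemic marks from the earlier Al-Jarf studies is what makes the clustering in the A+/A and B+/B bands visible.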

Recommendations
Findings of the present study shed some light on online exams in language, linguistics, and translation courses during the first two semesters of the COVID-19 Pandemic (Spring 2020 and Fall 2020): the challenges of online exams, the design and delivery of online exams, assessment forms and choices, course result options given to the students, and grade inflation issues. Results showed that, in general, instructors felt helpless and did not seem to know the principles of designing effective language, linguistics and translation tests, the higher- and lower-order skills tests measure, or the fact that effective tests depend not on the test item format but on the types of tasks required by the test items. Instructors and students must realize that we do not teach for the test and that the aim of the teaching-learning process is not to pass the exams but to achieve the learning goals.
No matter what the situation is, the annual assessment requirement should not be waived; instead, universities must adapt and be responsive to the nation's new reality and must create a plan for scaling back the assessments, not eliminating them (Jimenez, 2020).
To overcome the common problem of cheating in online exams, instructors and universities should raise students' awareness and ethics, train teachers in detecting the cheating techniques utilized by students, activate a code of practice, and apply severe sanctions on students who engage in cheating (Meccawy, Meccawy & Alsobhi, 2021). Fairness and integrity can be ensured by using technological tools such as video and audio recording surveillance (Gupta, Jankie, Pancholi, Talukdar, Sahu & Sa, 2020). To minimize cheating on online assessments, the assessment format can be modified in a way that minimizes or discourages cheating (Nguyen, Keuseman & Humston, 2020).
To alleviate students' anxiety about online learning in general and online exams in particular, instructors should raise students' awareness of online exam procedures and give them a chance to practice those procedures before the actual final exams are held, and state the course objectives that the students should achieve and the skills to be mastered by the end of the course. Instructors should provide guidance in the form of "do's" and "don'ts" for higher education (Jankowski, 2020).
Furthermore, academic staff need to scaffold online learning and assessment methods in the curriculum, i.e., instructors should break up the learning task into chunks and provide a tool, or structure, with each chunk (Slack & Priestley, 2022).
Since instructors were overwhelmed with online instruction and online exams during the Pandemic due to lack of training and prior experience, training and re-training of instructors and students, together with the provision of a virtual-learning-enabling infrastructure, are recommended to mitigate similar situations in the future (Sasere & Makhasane, 2020). Instructors need some kind of assessment support during exam periods as well.
Training instructors in online assessment should include question preparation, test administration, monitoring, grading, and communication with students before, during and after online exams. Al-Jarf (2022m) found instructors' qualifications, the pedagogical system, educational and professional experience, the integration of online instruction, the type of instant feedback given to the students, and the formative assessment techniques used to be significantly effective and important for enhancing the grammatical knowledge and writing quality of unskilled, low-ability EFL students, resulting in a significant improvement in their grammar and writing post-test scores.
Higher education instructors need adequate professional development on designing and administering online assessments. Online assessment should test both higher- and lower-level cognitive skills for sufficient evaluation of the students' knowledge and skills. A variety of assessment methods and a diversity of tasks must be used to ensure the reliability of the assessment outcomes (Mafugu, 2021). To improve the assessment of students' learning outcomes in online instruction, this study recommends the following:

• Setting standards for passing courses that should be taken into consideration in both Pandemic and non-Pandemic situations. These standards should specify the content and skills that the students should acquire in Pandemic and non-Pandemic situations, and in online and face-to-face instruction. In designing reliable, valid, discriminating and effective exams, instructors should specify in detail the course content to be covered by the test and the specific skills that the test will measure.
• On interpreting tests, the instructors may assign different podcasts or TED talks to different students in the same test session and require them to interpret them from L1 to L2 and vice versa (Al-Jarf, 2021i).
• On linguistics exams, the instructor may require the students to search for key linguistic and translation terms and concepts, give problem-solving questions, or conduct online debates. The students can summarize a research paper, with different students summarizing different papers. Instructors can give project-based assignments, and may connect writing and speaking topics with the Kingdom of Saudi Arabia's Vision 2030. The students may analyze translation errors in a text, with different texts given to different students, or conduct a linguistic analysis of family speech and videos.
They can also create podcasts and digital stories. Instructors can integrate technologies such as Slido and Padlet to deliver the tests (Al-Jarf, 2022i; Al-Jarf, 2022h; Al-Jarf, 2021d).
• On oral exams, the instructor can assign a topic which the students research and prepare at home and then present orally online through the platform; hold online debates about current issues; ask problem-solving questions; have students create podcasts on topics of their choice and publish them in a Speaking Center on Twitter; combine listening and speaking activities; use Vocaroo, a free online audio recorder, to record conversations and presentations; and use the Kahoot app and others (Al-Jarf, 2021c; Al-Jarf, 2021b).
• For online reading tests, different students can read different texts and identify the main ideas of the paragraphs, write the topic sentence of each, make an outline, write a summary, identify the text structure and so on (Al-Jarf, 2009; Al-Jarf, 2007a).
• For writing tests, the instructor can integrate participation goals that require students' involvement in social and civic issues. They may write about topics related to local and global social, educational, health, political and/or technological issues with which the students are familiar and to which they can relate. The students can describe a problem, its causes and suggest solutions to it (Al-Jarf, 2021e). They may write about current global events and COVID-19 issues (Al-Jarf, 2022b; Al-Jarf, 2022c; Al-Jarf, 2014).

• On vocabulary tests, students should be given tasks that require multiple associations, such as writing the silent letters in a sample of words; giving the part of speech of some words; circling the words in which -er is not a suffix; underlining the words that have no singular form; giving a synonym; giving the Arabic meaning of some words; giving the American equivalent of some British expressions; filling in the blanks with a preposition; showing the difference between pairs of sentences by giving their Arabic meanings; selecting the appropriate idiom/collocation; or providing situations and asking the students to supply an expression of 'apology'. They may also be required to give singular and plural forms, and derive nouns, verbs and adjectives using appropriate suffixes (Al-Jarf, 2022a; Al-Jarf, 2022e; Al-Jarf, 2022l; Al-Jarf, 2019b; Al-Jarf, 2015; Al-Jarf, 2008).
• On grammar tests, the instructor gives questions that require the student to produce grammatical structures in context, such as: (i) Read the following story, then write 10 … (vii) Write a story or summarize a movie using the historical present, and so on (Al-Jarf, 2005; Al-Jarf, 2017b). A variety of linguistic landscapes can be given to different students, with questions that require them to analyze and give the meanings of certain lexical items and grammatical structures in the linguistic landscape (Al-Jarf, 2021h).
• Listening, speaking, reading, writing, grammar and vocabulary tests should be based on a list of skills to be mastered and the content details covered in the classroom, whether face-to-face or online, and for general or specific purposes. A table of specifications should be constructed that sets out the number of items, and the marks, allocated to each skill and each content detail. For reliable tests, the test items should adequately sample the content covered and the skills practiced over the whole semester (Al-Jarf, 2021j; Al-Jarf, 2015; Al-Jarf, 2013; Al-Jarf, 2011).
• Instead of giving a single end-of-course final exam, instructors can use formative assessment, where a quiz is given every two weeks to assess the students' mastery of the skills and content covered in those two weeks and to diagnose students' problems before moving on to new material or skills (Al-Jarf, 2005; Al-Jarf, 2004).
• In vocabulary and grammar tests, the instructor can create several parallel versions of the same test by randomizing the order of the questions and randomizing the order of the lexical items within each question.

• The students may use mobile flashcards to prepare for standardized tests such as the TOEFL, IELTS and GRE (Al-Jarf, 2021g).
• To assess students' online practicum, instructors may ask the students to perform online microteaching, i.e., peer teaching where one student performs as a teacher and her classmates act as students, or online simulated teaching, i.e., a kind of role playing where the student performs the role of a teacher, or any other role, without preliminary training or rehearsal. They may also perform remote hands-on teaching (Al-Jarf, 2022f).
• When conducting an online thesis defense, the examiners should comment on the linguistic, organizational and methodological aspects of the thesis, such as the statement of the problem and questions of the study, the literature review, subject selection, the design of the research instrument, data collection and analysis, the reporting and discussion of the results, and the recommendations given (Al-Jarf, 2022k).
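To make the table-of-specifications recommendation above concrete, the following minimal sketch (in Python, with hypothetical skills, units and numbers, not figures from this study) stores each skill-by-content cell with its item count and marks, so the totals can be checked against the intended exam length and full mark:

```python
# Hypothetical table of specifications for a reading test.
# Each (skill, content unit) cell records the number of items
# allocated to it and the marks carried by each item.
spec = {
    ("identifying main ideas", "Unit 1"): {"items": 4, "marks_each": 1},
    ("identifying main ideas", "Unit 2"): {"items": 4, "marks_each": 1},
    ("analyzing text structure", "Unit 1"): {"items": 3, "marks_each": 2},
    ("summarizing", "Unit 2"): {"items": 2, "marks_each": 3},
}

# Totals derived from the table: how many items the exam contains
# and how many marks it is worth overall.
total_items = sum(cell["items"] for cell in spec.values())
total_marks = sum(cell["items"] * cell["marks_each"] for cell in spec.values())
print(total_items, total_marks)  # 13 items worth 20 marks
```

Keeping the table in one place like this makes it easy to verify that every skill and content unit taught during the semester is sampled by the exam.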
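The parallel-versions recommendation above, i.e., randomizing the order of the questions and the order of the lexical items within each question, can be sketched as follows; the question data and the function name are hypothetical illustrations, not a prescribed implementation:

```python
import random

def make_version(questions, seed):
    """Build one parallel version of a test by shuffling the order of
    the questions and the order of the options within each question."""
    rng = random.Random(seed)  # a fixed seed makes each version reproducible
    version = []
    for q in rng.sample(questions, k=len(questions)):  # shuffled question order
        version.append({
            "prompt": q["prompt"],
            # shuffled lexical items within the question
            "options": rng.sample(q["options"], k=len(q["options"])),
        })
    return version

questions = [
    {"prompt": "Choose a synonym of 'rapid'.",
     "options": ["fast", "slow", "late", "dull"]},
    {"prompt": "Circle the word in which -er is NOT a suffix.",
     "options": ["corner", "teacher", "singer", "writer"]},
    {"prompt": "Fill in the blank: interested ___ music.",
     "options": ["in", "on", "at", "by"]},
]

version_a = make_version(questions, seed=1)  # one version per student or group
version_b = make_version(questions, seed=2)
```

Every version contains the same questions and options, only in a different order, so the parallel forms remain equivalent in content and difficulty while making answer-sharing between students during the exam harder.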
Finally, future research should continue to follow the status of online assessment, procedures, and difficulties after the Pandemic to make further improvements and ensure that exams test the required learning outcomes.