
Remote Testing


Should circumstances arise that require you to teach remotely, you may also find yourself needing to administer tests remotely. In the guidance that follows, the term tests includes quizzes, exams, and other assessments that can normally be delivered in the form of a document that students receive, complete, and return to the professor. Best practices for online projects (roughly conceived as activities culminating in a document that students submit or a performance that students enact) are available elsewhere.

Tools for Remote Testing

The Tests & Quizzes tool within Courses is the preferred tool for remote testing. It has been designed specifically for this purpose, complies with FERPA, provides for accessibility, and offers a range of pedagogical options. If you need any assistance setting up assessments in the Tests & Quizzes tool, Pepperdine's Technology and Learning team or your division's Technology Liaison can provide individual support.

Google Forms can also be used as a quiz platform, although Courses usually provides more robust options and analytics. Moreover, there is currently no direct way to import grades from a Google Form quiz into the Courses Gradebook. If you do choose to use Google Forms as a quizzing platform, activate the settings that limit responses to the Pepperdine domain and that automatically collect Pepperdine email addresses from respondents.

Some textbook publishers may provide their own online homework and quizzing platforms. The CTE, TechLearn, and your Technology Liaison cannot provide technical support for your use of these tools, nor certify their compliance with FERPA regulations and Pepperdine's privacy standards. Therefore, it is recommended that you use such tools only for low-stakes formative assessments and practice, not for summative assessments.

Designing and Administering Remote Tests

During a period of emergency remote course delivery, all tests become de facto take-home exams. Many of the same considerations therefore apply whether you ask students to complete tests on paper and return them via scan or regular mail, or to complete them online using Courses (or another tool that fits your needs). However, using Courses and Zoom enables some strategies that print-and-return tests cannot support.

 Instructor Presence

During an on-site test, the professor is generally accessible to the students for questions. To provide similar access during an online test, schedule the test for a synchronous session, host a Zoom meeting for the students during the same period, and use the private chat feature in Zoom to field individual students' questions. However, you should do this only if you can guarantee equitable access for all students.


 Accommodations

Students who receive accommodations for onsite testing must receive equivalent accommodations for online testing. The common accommodation of extended time is easy to implement in the Courses Tests & Quizzes tool. Other accommodations may require additional guidance from the Office of Student Accessibility or assistance from individuals located near the student. Please consult the OSA for policy guidance and the CTE, the TechLearn team, or your divisional Technology Liaison for help with implementation.

 Time Limits and Deadlines

You can place time limits on an online test or quiz just as you would for an on-site test. This feature is best used for synchronous tests that must fit within a specific time frame (relative to Pacific Time) so that your test does not overlap with another scheduled synchronous class or test.

However, there are many good reasons not to impose time limits on most asynchronous tests. Students' variable access to high-speed internet is a particular concern. Depending on conditions completely outside the student's control, such as the age of their device or the local internet infrastructure, students may experience significantly different load times for questions. If one student can load a new page in under five seconds but another student requires twenty seconds to load the same page because of slower internet service, the second student is at a disadvantage on the timed test.

On-campus tests are often given under time-limited conditions simply because the class must vacate the classroom for the next class to come in. Online tests, if given synchronously, must abide by the same protocol as a courtesy to students and to faculty colleagues. Asynchronous tests, however, are unbound by this consideration. Therefore, before imposing a time limit on an online test that you intend to deliver asynchronously, consider whether the time limit is really necessary to show that students have reached the desired learning outcomes. In other words, beware of unintentionally allowing items like "student types quickly and accurately" to sneak onto a "stealth rubric."

A risk of network interruptions accompanies any lengthy online experience, especially for students living in areas with poor internet infrastructure or those relying on cellular phone service. To reduce this risk, divide lengthy tests into shorter segments (whether timed or not) that students complete and submit individually.

With our students living all around the globe, it's important to take time zone differences into account when scheduling online examinations. Holding the exam at the time it would have been held in Malibu keeps the exam from encroaching on other classes' scheduled activities. This practice is not equitable, however, for students living far from Malibu; taking an examination mid-afternoon local time can be markedly different from taking the same exam in the wee hours of the morning local time. Giving examinations asynchronously, within an 18- to 24-hour window starting at the time the exam would have started under on-campus conditions, should allow all students to take the exam under more equitable conditions.
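To see why a fixed Malibu start time can be inequitable, the conversion can be sketched with Python's standard zoneinfo module. (The date and the student locations below are hypothetical examples chosen for illustration, not actual class data.)

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical example: an exam scheduled for 9:00 a.m. Malibu time
# on an arbitrarily chosen date.
exam_start = datetime(2024, 11, 4, 9, 0, tzinfo=ZoneInfo("America/Los_Angeles"))

# The same instant for two hypothetical student locations.
paris = exam_start.astimezone(ZoneInfo("Europe/Paris"))
shanghai = exam_start.astimezone(ZoneInfo("Asia/Shanghai"))

print(paris.strftime("%H:%M"))     # early evening local time
print(shanghai.strftime("%H:%M"))  # the middle of the night local time
```

A student in Paris would sit the exam in the evening, while a student in Shanghai would sit it after midnight, which is the inequity an asynchronous window avoids.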

 Academic Integrity

The logistics of remote testing and the affordances of the internet make it easier for students inclined to cheat to do so, and may present a greater temptation for students who normally would not cheat in on-site contexts. Adapting pedagogically to these circumstances is generally preferable to deploying technological and logistical countermeasures, although the latter can also be useful.

Effective strategies for promoting academic integrity in online testing vary depending on the type of questions you want to ask. Remote examinations are typically considered more amenable to questions at higher levels of cognitive complexity than to those at lower levels. Consider separating your remote exams into discrete assessments (for example, one for fixed-answer questions and one for free-response questions) in order to best take advantage of the strategies presented here.

For remote examinations targeting higher levels of cognitive complexity, consider the following strategies.

  • Require students to affirm an honor pledge at the beginning of the assessment. The Tests & Quizzes tool in Courses has a built-in honor pledge item that requires students to check a box affirming the statement, "I will neither give nor receive aid on this assessment." You might wish to write a pledge more specific to your course and your expectations in the form of a fill-in-the-blank question where the expected answer is the student's name: "I, _____, pledge that I will ..." Spell out your precise expectations with regard to the types of resources students may and may not consult. Statements like this trigger a psychological effect called "priming": reminding someone of ethical standards right before they make an ethically-loaded decision predisposes them to act ethically.
  • Aim for questions at a moderate to high degree of cognitive complexity, and ask students to justify their answers. Of course, doing so will likely increase the time required for grading, so develop a rubric that is easy to apply and align the questions tightly with the rubric.
  • Design your tests with the assumption that, regardless of your preferences, students will in fact have unrestricted access to the internet while completing them. (At present, Seaver College does not recommend or offer technical support for lockdown browsers.) Avoid asking questions whose answers can easily be found using internet search engines or quickly looked up in textbooks (unless navigating search engines and textbook indices promotes your course's learning outcomes); alternately, leverage the de facto open-book nature of remote tests and require students to cite sources that support their answers—preferably sources already assigned for the course.
  • If you have substantial time to design or revise your course, de-emphasize tests and put more weight on other types of assignments whereby students can demonstrate their progress toward achieving the course's learning outcomes. Put more emphasis on frequent, low-stakes formative assessment over infrequent, high-stakes summative assessment.

For remote examinations targeting lower levels of cognitive complexity, consider the following strategies:

  • Replace infrequent, high-stakes assessments with more frequent, low-stakes assessments. This practice helps to mitigate the motivation to cheat in the first place. The weightier an assessment, the more powerful the motivation to cheat.
  • Display questions one at a time, in random sequence, with no backtracking allowed, under a time limit. If you need certain questions to cluster together, put them in different parts of the assessment (or even in separate assessments) and randomize the sequence within each part. This practice helps to mitigate the effectiveness of consulting unauthorized resources during the exam.
  • Draw questions randomly from question pools. You could construct your question pools by topic and question type or, for better alignment with your learning objectives, according to a matrix coordinating the knowledge and skills you want students to acquire with the levels of a cognitive taxonomy like Bloom's. This practice helps to mitigate the effectiveness of back-channel sharing of questions and answers.
  • Randomize the sequence of answers on multiple-choice questions. This practice helps to mitigate the effectiveness of back-channel sharing of questions and answers.
  • Set an overall time limit for the assessment. This method does have drawbacks (see under "Time Limits and Deadlines" elsewhere on this page), but does help to mitigate the effectiveness of consulting unauthorized resources during the exam. Under ideal conditions of internet connectivity, a time limit of about 45 seconds per question is recommended. However, given the variability in student connectivity, 75 to 90 seconds per question may be more equitable.
  • Limit the amount of time an assessment is available to students. A 24-hour window (preferably starting when the exam would normally begin in a face-to-face context) will obviously accommodate students around the globe. If you know from where your students will take the examination, you can reduce the window.
  • Release feedback to students only after (but as soon as possible after, to promote learning) the submission deadline has passed.

In addition to these mitigation techniques, you can use time to completion as a rough diagnostic tool (see Lang 2013, 136). Courses records the time a student begins an assessment in the Tests & Quizzes tool and the time they submit it, and calculates the elapsed time. Students who take far longer than average (more than two standard deviations from the mean, for example) might have been looking up answers that most students had memorized, and students who finish far faster than average might have received answers from a classmate who had already completed the test. Time to completion should never be the sole evidence of cheating, but it can serve as an indicator.
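As a sketch of that diagnostic, the screening arithmetic might look like the following. (The two-standard-deviation threshold and the sample times are illustrative assumptions; Courses itself only reports the elapsed times.)

```python
from statistics import mean, stdev

def flag_unusual_times(minutes, k=2.0):
    """Return completion times more than k standard deviations from the mean.

    A rough screening heuristic only; never evidence of cheating on its own.
    """
    m, s = mean(minutes), stdev(minutes)
    return [t for t in minutes if abs(t - m) > k * s]

# Hypothetical completion times (in minutes) for one assessment.
times = [30, 32, 35, 33, 31, 34, 90, 5]
print(flag_unusual_times(times))  # [90]
```

Note that extreme values inflate the standard deviation, so in this sample only the 90-minute submission is flagged; the 5-minute one falls just inside the threshold and would need to be noticed by eye.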

The TechLearn team or your divisional Technology Liaison can show you how to implement any of these techniques in the Courses Tests & Quizzes tool.

At present, Seaver College does not recommend or offer technical support for third-party online proctoring services. The Zoom platform does give you the capability to visually monitor students while they are taking an online test, and your division's technology liaison can share with you the Zoom settings needed to make this kind of do-it-yourself remote proctoring work best. This technique also has the virtue of making you available to the students for questions, as previously discussed. However, this technique also risks creating an inequitable testing environment for students based on variables beyond your and their control, such as the quality of students' internet connections and even the effects of taking the exam at different times of day relative to the student's local time. Moreover, students subjected to this kind of proctoring may infer that their professor does not trust them, which may lead the students to view testing itself as an adversarial contest between student and professor—which in turn could actually spur some students to attempt cheating just to "beat the system."

Preparing Students for Online Testing

Before you administer the first graded online test in any particular course, give students a no-stakes practice test so they can become accustomed to the testing platform. On this practice test, include samples of all the different types of questions (multiple choice, fill-in-the-blank, short answer, matrix of choices, etc.) that you think you may wish to use later in the course.

For each online test you plan to administer, use the software's preview function so that you can see the test as students will see it. If your test includes embedded images, videos, PDFs, or other types of media, you might also consider enrolling a colleague (or yourself using a secondary email address) into your course as a student and asking them to complete each test with embedded media before you release it to students. Although a bit cumbersome, this method will reveal any problems with students' access to the embedded media resources (something that the preview function, at least in Courses and Google Forms, will not detect).

To help your students avoid technological issues during an online test, offer the following advice. (In the bullet points below, "you" refers to the student reading the advice.)

  • Use a wired internet connection if you can.
  • If you must use WiFi, use a private WiFi network if you can.
  • If you must use WiFi, try to reduce the number of devices accessing the same WiFi network simultaneously. For example, put your phone in airplane mode while taking a test or quiz on your laptop computer.
  • Never have Courses open simultaneously in more than one window or tab in your web browser. Failure to follow this guidance is the most common cause of data loss while taking an online assessment in Courses.
  • When navigating an online exam, don't use the browser back button. Instead, use the "Next" and "Back" buttons within the assessment interface.
  • Wait for each page to load completely. If you begin selecting answers before the page fully loads, you may lose work.
  • Write short answer or essay questions in a word processor or text editor, then paste your answer into Courses. This way, a backup copy of the work will be available if anything goes wrong.
  • Save your test answers frequently.
  • Don't forget to submit your exam when finished! In Courses, you must click "Submit for Grading" on two separate pages in order to complete the submission process. Read all of the text on the screen carefully so that you know which step you're on. You haven't turned in your test until you've clicked "Submit for Grading" on the second, confirmation screen and received a confirmation that you've submitted your assessment.

For more on tips and best practices for students, this documentation may be helpful.

Sources for this document include but are not limited to Lars Bengtsson (2019), "Take-Home Exams in Higher Education: A Systematic Review," Education Sciences (online); G. R. Cluskey, Jr., Craig R. Ehlen, and Mitchell H. Raiborn (2011), "Thwarting Online Exam Cheating Without Proctor Supervision," Journal of Academic and Business Ethics 4; Arthur Dobrin (2013), "How to Keep Students From Cheating," Psychology Today (online); James M. Lang (2013), Cheating Lessons: Learning from Academic Dishonesty (Cambridge: Harvard University Press); and the websites of numerous other centers for teaching and learning.