- Open-book testing
A notable contribution of the COVID-19 pandemic has been the rapid development of open-book testing. Open-book tests and exams allow educators to ask questions that cannot be answered by access to information sources alone. They require higher cognitive skills: information retrieval and processing, and critical thinking rather than memorization. In many ways, an open-book exam is closer to normal work experience, and open-book testing prepares students for work in the digital world[1].
It seems that the supervision and restrictions employed to ensure a level playing field in remote testing are not the only possible path to fair online testing. This becomes especially clear when stepped-up checking by educators provokes a reaction from students who, feeling pressured, seek ever more sophisticated ways to bypass the restrictions. Cooperation between student and teacher then turns into an unwanted "head-to-head" contest over cheating. The situation is especially acute in an online environment, where supervision during proctored testing is very noticeable and every flaw in test security can easily invalidate the test.
Social and technological progress also plays a role. Today's students are "digital natives", at ease with new technologies, and communication and computing devices keep becoming more compact and sophisticated. In the future, it will be almost impossible to prevent students from using them in distance exams, and increasingly difficult even in face-to-face exams. The use of online resources during an exam will thus become uncontrollable, and banning it will be practically unenforceable. Radical restrictions, such as switching off data connections and mobile services throughout the country on the day of entrance exams, do not seem to be the right (or desirable) answer in our prevailing local conditions[2].
To maintain a level playing field and preserve academic integrity under these new conditions, we can shift the focus of assessment from traditional (closed-book) tests to tests with open access to information: testing knowledge itself less, and skills more. We use classic tests out of inertia, and often simply because we had no way to assess skills before; it is time to start changing that.
The transition to open-book testing means adapting the entire test agenda to the new situation. Classic testing allows questions focused on the recall of individual facts; open-book testing means moving to the higher levels of Bloom's taxonomy. Questions should require students to apply their knowledge to new situations and to use analytical and critical thinking. For this approach to be fair to students, they should practice these more advanced cognitive skills well before they need them on a test.
It is extremely difficult to rework existing tests into open-book form. In practice, it is necessary to abandon all pure-knowledge test items and develop new ones, at higher levels of Bloom's taxonomy, that test understanding and skills.
- Benefits
One of the most challenging, but also most rewarding, aspects of switching to open-book testing is changing your perspective on academic education. If you are honest, you will admit that in your own work you are constantly searching for information, patterns, and details, which you then apply to a specific situation, synthesize, and gradually turn into your own work. Our students will do the same in the future, and we should prepare them for it. We need to structure open-book assessment to measure their ability to perform this application and synthesis, rather than testing memorization of individual pieces of information that they will forget in a month.
So, can open-book tests and exams help solve the problem of cheating in distance testing? It appears they can: recent systematic reviews have shown that closed-book and open-book tests produce comparable results[1],[3].
And not only that. Open-book tests can help engage students in reflective and critical thinking. They also foster digital literacy, critical thinking, and lifelong learning, all important ingredients of graduates' future employability.
Zagury-Orly and Durning are not alone in thinking it likely that in the future we will see a hybrid model in which students are assessed using a combination of open-book and closed-book tests. The first, closed-book part of the exam would assess what students should know without consulting textbooks; the second, open-book part would focus on higher cognitive levels and on skills relevant to evidence-based practice[4],[5].
- Risks
One risk of open-book tests is that teachers may not initially know how to design effective test items that require critical thinking. Students, in turn, may be lulled into the false notion that they can look everything up during the exam and so fail to prepare properly, assuming the exam will be easy and all answers can be found in the textbook or other authorized sources.
Even in an open-book exam, the teacher must define the scope of permitted information sources in order to maintain equality of opportunity.
- Recommendations for creating questions for open-book tests
For open-book testing, open-ended item types that give students more space, such as constructed-response questions, are particularly useful. Use story-based items that require students to apply critical thinking to a trigger scenario: present data to the students and ask what it might mean in the given scenario, what else could have affected it, how it can be verified, and so on.
Here are some examples of open-ended items suitable for open-book testing (sorted according to Bloom's taxonomy levels):
- Application
- Arrange ... to demonstrate ...
- Analysis
- Identify the error in the proof or calculation
- Explain this situation in terms of theory...
- What are the counterarguments...
- Why is result A different from result B?
- What is the relationship between X and Y?
- Synthesis
- Given a description of an experiment: what do you expect the result to be?
- Describe the next step in this process...
- Which method is best for this task?
- Which argument is the strongest?
- Evaluation
- Assess the situation against this set of criteria
- Evaluate, assess, recommend what would be better, ...
- What changes would you make?
- What would happen if...
- In questions, use wording such as "what is most appropriate" or "what is most important", which guides students toward formulating judgments and stances.
References
- ↑ SAM, Amir H., Michael D. REID and Anjali AMIN. High-stakes, remote-access, open-book examinations. Medical Education [online]. 2020, 54(8), 767-768 [cited 2021-11-27]. ISSN 0308-0110. Available from: doi:10.1111/medu.14247
- ↑ Uzbekistán vypnul internet a SMS [Uzbekistan switched off the internet and SMS: a step some may consider radical, taken in Uzbekistan against corruption and fraud] [online]. 2014, 8.8.2014 [cited 2021-11-13]. Available from: https://www.esemes.cz/magazin/uzbekistan-vypnul-internet-a-sms/
- ↑ DURNING, Steven J., Ting DONG, Temple RATCLIFFE, Lambert SCHUWIRTH, Anthony R. ARTINO, John R. BOULET and Kevin EVA. Comparing Open-Book and Closed-Book Examinations. Academic Medicine [online]. 2016, 91(4), 583-599 [cited 2021-11-13]. ISSN 1040-2446. Available from: doi:10.1097/ACM.0000000000000977
- ↑ ZAGURY-ORLY, Ivry and Steven J. DURNING. Assessing open-book examination in medical education: The time is now. Medical Teacher [online]. 2021, 43(8), 972-973 [cited 2021-11-14]. ISSN 0142-159X. Available from: doi:10.1080/0142159X.2020.1811214
- ↑ JOHANNS, Beth, Amber DINKENS and Jill MOORE. A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse Education in Practice [online]. 2017, 27, 89-94 [cited 2021-11-27]. ISSN 1471-5953. Available from: doi:10.1016/j.nepr.2017.08.018