Academic Integrity
Cheating is a chronic pain point for both users and developers of online courseware. Working on a team of courseware SMEs, I designed and tested several anti-cheating feature concepts that could be used across Pearson’s courseware portfolio. Two of the feature concepts tested well and won approval for additional investment.
Discovery
Various business units within Pearson had already done extensive, recent research on academic integrity before we began work. Rather than commission new research, we networked with colleagues in other divisions to acquire their findings. With a wealth of anti-cheating research in hand, we then conducted a review to identify some of the top concerns and problems regarding academic dishonesty.
Definition
From the insights generated during our discover-and-empathize process, we distilled several How-Might-We questions around cheating prevention.
Ideation: Learner Scaffolding
Next, we put our How-Might-We questions to work generating feature ideas. Here we did some rapid sketching around offering learners help while they worked on a question. This help, called scaffolding, involves breaking a question down into smaller parts and offering hints for solving each part. We also considered how a learner’s overall mastery of the learning objective at hand might factor in. Similar ideation exercises followed for the other features under consideration.
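The scaffolding idea can be sketched in code. This is a minimal, hypothetical model, not Pearson’s implementation; the step structure, the mastery scale, and the 0.8 threshold are all illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class ScaffoldStep:
    prompt: str   # one sub-part of the full question
    hint: str     # a hint for solving this sub-part

@dataclass
class Question:
    stem: str
    steps: list[ScaffoldStep] = field(default_factory=list)

def scaffold_for(question: Question, mastery: float) -> list[ScaffoldStep]:
    """Return the scaffold steps a learner should see.

    Learners whose mastery of the learning objective is high
    (hypothetical cutoff: 0.8 on a 0-1 scale) see the question whole;
    lower-mastery learners get the step-by-step breakdown with hints.
    """
    if mastery >= 0.8:
        return []  # confident learner: no scaffolding needed
    return question.steps
```

A learner at 0.3 mastery would receive every step and its hints, while a learner at 0.9 would see only the original question.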
Ideation: Cheater's Honeypot
Some of my ideas for stopping cheating went beyond the UX arena. In information security, a honeypot is a resource that looks vulnerable and attractive to an attacker but is in fact monitored by the system operators. Sites like Chegg have made a business out of helping students cheat, but their Achilles’ heel is user-submitted content. Pearson could undermine sites like this by planting decoy answers to its own questions there and then looking for those decoys in test answers submitted by students. Decoys would be designed so that a student genuinely attempting the work would be highly unlikely to produce them; only students who copied them would be likely to submit them.
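The detection side of the honeypot is simple set membership. The sketch below is purely illustrative, assuming a hypothetical registry of planted decoys keyed by question ID; the question IDs and decoy values are made up.

```python
# Hypothetical registry of decoy answers planted on answer-sharing sites.
# Each entry pairs a question ID with the decoy value seeded for it;
# the real answers to these questions would be different.
DECOYS = {
    ("q42", "17.3%"),
    ("q87", "x = -5"),
}

def matches_decoy(question_id: str, answer: str) -> bool:
    """True if a submitted answer matches a decoy planted for that question.

    A match is strong evidence the answer was copied rather than worked
    out, since no honest attempt should arrive at the decoy value.
    """
    return (question_id, answer.strip()) in DECOYS
```

In practice a flagged match would feed an instructor-facing report rather than an automatic penalty, since even a well-chosen decoy could conceivably be reached honestly.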
Prototyping: Copy & Paste Detection
Sometimes the ideation process hands you an idea you’re deeply skeptical about. In this case, it was detecting keystrokes to flag when an answer was copied or pasted, and then warning the student. I felt this was too Big Brother, but there were good reasons to try it out. We knew from discovery that some students were pasting copied answers directly into our courseware. In testing, many professors liked this feature. But students hated it so much that we reconsidered it. Ultimately, an accessibility issue carried the day. There were legitimate reasons a disabled learner might paste an answer, so we didn’t go forward with implementing the feature.
Prototyping: Time Management
Our research indicated students often cheated when they felt time pressure to complete an assignment. To ease this issue, I redesigned the screen students encounter right before they launch an assignment. The old version presented a long list of all the questions in the assignment with a Start button. The new version chunks the assignment into smaller time intervals grouped onto cards, letting students choose a chunk to work on. When all the questions on a card are completed, the card dims and moves to the right.
This feature tested well. Busy students who had previously felt overwhelmed by the assignment’s length now found getting started much easier.
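The chunking behind the cards can be sketched as a greedy grouping of questions by estimated time. This is an illustrative sketch, not the shipped logic; the per-question time estimates and the 15-minute card limit are assumptions.

```python
def chunk_assignment(questions, limit_minutes=15):
    """Greedily group (question_id, est_minutes) pairs into cards.

    Each card holds consecutive questions whose estimated times sum to
    at most limit_minutes (the 15-minute default is illustrative), so a
    busy student can pick up one card's worth of work at a time.
    """
    cards, current, total = [], [], 0
    for qid, minutes in questions:
        if current and total + minutes > limit_minutes:
            cards.append(current)       # close the full card
            current, total = [], 0
        current.append(qid)
        total += minutes
    if current:
        cards.append(current)           # last, possibly partial card
    return cards
```

For example, four questions estimated at 5, 7, 6, and 10 minutes would land on three cards: the first two questions together, then one card each for the remaining two.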
Prototyping: Honor Code Reminder
Many universities have specific academic honesty policies, and students are often unaware of them. This feature takes a cue from psychology studies showing that reminding people of laws increases compliance. Here, a simple popup displays a link to the university’s code when the student launches a quiz or test.
Prototyping: Time on Task Detection
A student with a list of answers doesn’t need to spend any time working out the problem. We were already recording time on task as a diagnostic tool, so I leveraged that data to intervene when a student appeared to be cheating. If an answer is submitted faster than a threshold based on the average student’s time to completion, this dialog appears. Rather than a heavy-handed accusation of cheating, the copy gently suggests that the user might be rushing their answer.
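The trigger condition amounts to a simple comparison against a fraction of the average completion time. A minimal sketch, assuming a hypothetical 25% cutoff (the real threshold would be tuned per question):

```python
def is_rushed(elapsed_seconds: float, avg_seconds: float,
              fraction: float = 0.25) -> bool:
    """Flag an answer submitted in under `fraction` of the average
    student's time to completion for this question.

    The 0.25 cutoff is an illustrative assumption; a flagged submission
    triggers the gentle "you might be rushing" dialog, not an accusation.
    """
    return elapsed_seconds < fraction * avg_seconds
```

So on a question averaging two minutes, an answer submitted after ten seconds would surface the dialog, while one submitted after a minute would not.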
Testing
We conducted two rounds of interviews and user testing on the prototyped features, with both student and instructor participants. We were able to recommend Pearson pursue several of the features while discarding others that proved either ineffective or too invasive. The time management, instructional scaffolding, and time-on-task detection features had been chosen by Pearson’s product council for additional investment by the time I signed off on the project.