Higher Education Research and Development Society of Australasia

The COVID-19 pandemic accelerated the use of online assessments, and they’ve remained popular because they offer flexibility. Lecturers are also able to incorporate multimedia into the assessment, and of course, online exams minimise paper waste.
Online exams also come with disadvantages. Students have reported increased anxiety around the potential for internet issues or technical problems, and the use of computers for exams may also add to cognitive load. But potentially the biggest downside of online exams now is the impact of generative AI on academic integrity. An estimated 83% of Australian university students are using generative AI platforms. Concerns about cheating on assessments using generative AI are on the rise, with anecdotal reports that cheating in online exams is particularly easy.
In my research skills unit, which includes a focus on statistics, I replaced the online exam with a totally different type of online assessment: the digital decision index.
So, what is a digital decision index? My students didn't know, and neither did AI when they tried asking! (I'm sure it will learn soon.)
For me, the digital decision index is an electronic tool that acts like a framework, or interactive decision tree, that guides a user through a decision making process, providing outcomes and additional relevant information.
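As a rough illustration only (not any student's actual tool), the core logic of such an index can be sketched as a small decision tree; the questions, branches and test names below are simplified assumptions, not a complete guide to statistical test selection:

```python
# A minimal, hypothetical sketch of a digital decision index as a decision tree.
# Questions, branches, and test names are simplified illustrations.

TREE = {
    "question": "Is the outcome variable continuous?",
    "yes": {
        "question": "Are you comparing two groups?",
        "yes": {"recommendation": "Independent-samples t-test"},
        "no": {"recommendation": "One-way ANOVA"},
    },
    "no": {"recommendation": "Chi-square test of independence"},
}

def walk(node, answers):
    """Follow yes/no answers down the tree until a recommendation is reached."""
    for answer in answers:
        if "recommendation" in node:
            break
        node = node["yes" if answer else "no"]
    return node.get("recommendation")

# Example: continuous outcome, comparing two groups -> t-test
print(walk(TREE, [True, True]))
```

A real student index would carry far more branches, plus the "additional relevant information" at each step (assumptions of the test, software commands, and so on).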
Students were asked to design an index that would help them select the right statistical test for a given situation, and to interpret the results. They could build it in any electronic format that worked for them – software like Excel, Canva, Twine and Lucidchart were used. The aim was for students to create a resource they could use after completing the unit, to assist them when they began a new research project (perhaps in another unit).
For the assessment, students presented their index in an online class and discussed how they developed it. I then asked each student to use their index to help choose, run and interpret a statistical test to address a research question, using a dataset I supplied. Students were each given a different question, which they did not see beforehand. Marks were given for the creation and effectiveness of the digital decision index, along with clarity and efficiency.
I was struck by the wide variety of digital decision indexes created by the students. Some used a flow chart style while others developed a linked database where answers to questions add up behind the scenes to produce a recommendation at the end. Students often made use of hyperlinks to link to additional information in an efficient way.
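The "answers add up behind the scenes" style can also be sketched in code; the answer options and scoring weights here are invented purely for illustration:

```python
# Hypothetical sketch of the linked-database style of index: each answer
# contributes points toward candidate tests, and the highest-scoring test
# becomes the recommendation. Options and weights are invented examples.

from collections import Counter

SCORES = {
    "outcome_continuous": {"t-test": 1, "ANOVA": 1},
    "outcome_categorical": {"chi-square": 2},
    "two_groups": {"t-test": 1},
    "three_plus_groups": {"ANOVA": 1},
}

def recommend(answers):
    """Tally points for each candidate test and return the top scorer."""
    totals = Counter()
    for answer in answers:
        totals.update(SCORES.get(answer, {}))
    test, _ = totals.most_common(1)[0]
    return test

# Example: continuous outcome, three or more groups -> ANOVA
print(recommend(["outcome_continuous", "three_plus_groups"]))
```

In a spreadsheet version the same tallying happens in hidden cells, with a lookup surfacing the final recommendation.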

Examples of different types of digital decision indices
Preliminary feedback, obtained via an anonymous, optional Qualtrics survey (n=18, 2024-25), has been positive. Most students reported experiences like these:
“I liked using the tool, it was nice not to have another written assessment. I thought the process of putting together the tool was helpful in organising my thoughts, and felt like a more realistic application of stats than an exam would be.”
“It was much more interactive than it would have been if we had a test and we would have just read the text book to study. This made us apply it practically while creating the tool which was then us "studying" for the assessment”
“I had very little understanding about stats initially. While the assignment seemed impossible to begin with, it really made me learn about and understand different tests, even just "simple" things like understanding different types of data which I could not get my head around at the beginning.”
“It was very stressful considering there was limited time to do the assessment, but once I upskilled with LucidChart and came up with the idea of putting SPSS commands into my hyperlinks, it was extremely quick, easy and efficient to use. Stress well worth it.”
About half the students did use AI – mainly for troubleshooting, selecting or learning new software, and clarifying statistical principles. Importantly, most felt AI couldn’t replace their own thinking in this task.
“I asked AI for ideas on the easiest software to use for this task”
“I had ZERO understanding before, so this really helped cement my learning. Also, now I have the tool to use in future.”
Asked about challenges, students mentioned:
“Not having heard of the concept, nor any of my friends who work in programming/IT understanding what it was and the internet not being very useful in defining/explaining what it was either.”
“Structure and design was my biggest concern, but knowing it could be any format that makes sense to us helped with getting started.”
“It was an extremely time consuming assignment- when you weigh up the benefits x work in putted unfortunately I don't think the benefits outweighed the effort”
Overall satisfaction with this unit improved by 12.5% compared to the previous year, when the assessment was an online exam, as assessed by the university’s Unit Teaching and Evaluation Instrument (UTEI) survey.
Based on feedback and reflection, there are a few key aspects to help make this assessment work well.
When I shared this work at a HERDSA conference roundtable in July 2025, delegates raised some additional helpful points:
These conversations reinforced that authentic assessment is about helping students engage meaningfully with the learning process.
Although concerns about academic integrity are pushing universities to move away from online exams, the value of online learning cannot be ignored. There is a need for more authentic, engaging online assessments, like the digital decision index, that are process driven and resistant to academic misconduct.
The HERDSA Connect Blog offers comment and discussion on higher education issues; provides information about relevant publications, programs and research and celebrates the achievements of our HERDSA members.
HERDSA Connect links members of the HERDSA community in Australasia and beyond by sharing branch activities, member perspectives and achievements, book reviews, comments on contemporary issues in higher education, and conference reflections.
Members are encouraged to respond to articles and engage in ongoing discussion relevant to higher education and aligned to HERDSA’s values and mission. Contact Daniel Andrews Daniel.Andrews@herdsa.org.au to propose a blog post for the HERDSA Connect blog.