Designing for the uncertain: Rethinking secure assessment in creative disciplines

One of the enduring paradoxes of education is that learning is both observable and elusive, especially in the creative disciplines. We want students to master technique, yet also to cultivate something less tangible: a taste, an intuition, an artistic judgement that develops through practice and lived experience. Assessment is our attempt to make this visible.

In the performing arts, real-time assessments are being valued anew – the physical act of acoustic performance offering a rare form of integrity, seemingly beyond technological mediation. The polished recording, by contrast, can leave us wondering whether the student iterated and made creative choices, or whether the work was heavily mediated by software. These questions pre-date AI; contemporary tools merely intensify them by compressing labour and obscuring provenance.

Across Australian higher education, the Tertiary Education Quality and Standards Agency (TEQSA) has asked every provider to develop credible, institution-wide responses to AI that emphasise assessment reform, academic integrity and transparent tool use. It is within this national context that we, educators with backgrounds in Drama, Music and Film, take up the shared tensions and dilemmas of secure assessment in the creative disciplines.

Subjective success metrics in creative disciplines

We can't address such tensions and dilemmas without first asking: what, exactly, are we securing in creative assessments? In creative practice, metrics of success can be slippery. Even when we focus on securing provenance, questions remain about what counts as a successful outcome for diverse cohorts: is this genre‑specific, tied to professional standards, guided by artistic aspirations, or something else?

For example, a music performance assessment rubric needs to balance technical accuracy against the student’s stylistic expression. A technically flawless song might sound lifeless to some, while a rough demo can move an audience to tears. Is a dramatic staging of a play successful because it achieves textual fidelity and technical control, because it communicates an embodied understanding of character in context, or because it provokes surprise? The recent turn towards historically informed performance (HIP) shows how artistic judgement itself evolves: approaches to Baroque music have shifted dramatically over the past few decades, inviting students to develop historically sensitive inquiry across all repertoire.

Rubrics often try to capture the breadth of creative success through categories such as craft, creativity and presentation, but none of these is a neutral concept. They carry assumptions and biases about what we (coordinators, tutors, assessors) value, which differ across genres, traditions and assessor identities. This is where the traditional attempt to secure assessment via live, invigilated conditions shows its limits: it can assure us that students did the work, but it remains hostage to the assessors’ subjectivity in situ.

Recognising this does not invalidate assessment of creativity. The challenge is to design ways for students to make their own criteria explicit: to justify why a chord works, why a beat change lands, why a staging choice sustains the given circumstances; to set their terms and convince assessors that they met them. In doing so, creative students reveal not just what they know, but also their location within a discipline’s evolving traditions.

AI as provocation

Generative AI is not the first technology to unsettle the question of provenance. For years, software such as GarageBand, Sibelius and Logic Pro has blurred the boundaries of authorship in Music; scriptwriting and editing tools like Final Draft, Celtx and Premiere Pro have done the same in Drama. What AI adds is far more powerful and opaque technological assistance, including the ability to convert ideas conveyed through written language into other media. AI therefore carries the potential to allow creators to forgo creative instruments (be they physical or digital), to compress labour and to absorb process. In creative assessments, AI outputs are not only convincing imitations; they also pose awkward questions about our longstanding metrics of value. If a machine can compose music that moves us from a written paragraph, then what is it that we ask of students?

The provocation of AI, perhaps, lies in its refusal to play by the constraints of our frames of reference. We are trained, and accustomed, to distinguish between technical competence and creative originality, between process and product, between the learnt and the innate. AI collapses these distinctions by presenting artefacts that appear both technically adequate and stylistically coherent, but with minimal human labour behind them. The work arrives, the argument goes, with insufficient struggle and without the sediment of lived experience. What AI throws back at us is the question: do we value the artefact, or the process of becoming capable of creating it? It continues to force a choice that we are largely unprepared to make.

Notwithstanding this, secure assessment cannot be about excluding AI, because integrity does not reside in quarantine from reality. Rather, the presence of AI shifts the meaning of “security” itself: away from the guarantee of provenance in which the student alone must press every key or deliver every line, and towards assurance that they inhabit a relationship to the work. That is, whether they can reflect upon process, defend the work, critique it, and situate it within a broader world of meanings defined by the artist.

Towards a different kind of certainty

Perhaps the paradox of security in creative assessment resists resolution because creativity is always evolving and unfolding. Intuition, taste and artistic judgement are not static boxes to be ticked off against a rubric; they are capacities shaped by practice, exposure and reflection. What matters in education is not to pin them down, but to help students develop a discernment that will continue to deepen after the grade. In Music Teacher Education this includes the reflective capacity to make and justify pedagogical choices about repertoire, arrangement, scaffolds and rehearsal strategies. In Drama Education, likewise, the emphasis shifts from performance quality to dramaturgy and rehearsal, where artistic achievement accrues through process rather than product.

One approach may be to identify finite elements of a creative or design process that can be observed securely in a time‑limited environment. A student cannot demonstrate their entire compositional journey in an hour, but they can be asked to take a given motif and develop it into a short piece that shows both coherence and variation. A Drama unit might pair a rapid in‑class scene study with a rehearsal portfolio and an interactive oral. These slices of process do not capture the whole, but they assure us that essential capacities are present. What emerges is less a single solution than a menu of secure possibilities.

The strength of such a menu is that it allows disciplines to tailor secure assessments to their own traditions, while remaining consistent with TEQSA’s sector guidance. Unlike a traffic-light approach, which tends to classify tasks in a fixed and binary way, a menu approach acknowledges the grey areas, overlaps and moments of ambiguity that call for creativity and authentic design thinking. By combining in-the-moment secure tasks with reflective work during and after the assessment, it supports both the establishment of creative integrity and the development of self-regulated learning. This way, educators can design combinations that surface judgement-in-action and make any permitted AI use explicit through method disclosure and defence.

Call to action

If secure assessment of creativity is to mean anything in the age of AI, it must secure students’ emerging professional sensibility: their capacity to work intimately and skilfully with the tools at hand while safeguarding the integrity of their own agency, mind and artistic judgement.

The invitation, then, is practical: to design a menu of assessment components that capture finite elements of creative process, while pairing them with reflective tasks that reveal judgement and integrity across disciplines.

This, we contend, is the kind of certainty worth designing for.

 

Blog Contributors

 

References & further resources

 

Banner image: Generated by ChatGPT


The HERDSA Connect Blog offers comment and discussion on higher education issues; provides information about relevant publications, programs and research and celebrates the achievements of our HERDSA members.
