The University of the Sunshine Coast (UniSC) is building a culture of integrity in the responsible and ethical use of artificial intelligence (AI) across its campuses, reaffirming its commitment to fair access to education, innovation and student success.
UniSC is embracing the transformative potential of AI in learning and teaching, while steadfastly upholding ethical principles.
Through the integration of AI-powered platforms, lecturers are developing personalised learning experiences, identifying students who may be struggling with engagement or understanding, and providing timely support.
"AI, when used responsibly, can enable more inclusive and engaging learning environments," UniSC Deputy Vice-Chancellor (Academic), Professor Michael Wilmore said.
"At UniSC, we're committed to making sure our students develop both digital literacy and ethical awareness so they are equipped for a future where AI will be ever-present."
As AI tools like generative text models become more sophisticated, UniSC has implemented robust guidelines and proactive measures to ensure assessments remain fair and meaningful.
All assessment practices are continually reviewed to ensure students' work is genuinely their own. Assessment design now increasingly incorporates oral components, project-based tasks and reflective writing that require critical thinking and personal engagement, tasks less easily completed by AI alone.
"Responsible use of AI in assessment means more than just detection. It's connected to our core UniSC value of integrity: educating both staff and students on ethical boundaries, transparency and academic honesty," Professor Wilmore explained.
Workshops are provided by library and student services, and online resources are available to offer just-in-time assistance.
"We focus on assessment design that values evidence of learning, process and creativity, making it clear where Gen AI can legitimately assist, and where its use would constitute misconduct."
UniSC's Academic Integrity Matters program employs a suite of methods to identify when generative AI has been used inappropriately in assessment tasks. These include advanced plagiarism detection software, forensic linguistic analysis, and prompt-based comparison techniques.
For instance, when an assignment is suspected of being AI-generated, markers may check for inconsistencies in language, a lack of personal reflection, unusual patterns in time spent on activities, or improbable citation patterns.
If concerns arise, UniSC follows a fair and transparent process. Students are given the opportunity to explain and provide evidence of their authorship.
Where misconduct is proven, outcomes are tailored to the nature and severity of the breach, from education around appropriate use, to formal sanctions in line with university policy.
"Maintaining trust in our qualifications means we must act swiftly and fairly when academic integrity is at stake," said Professor Wilmore.
"Our processes ensure that each case is reviewed individually, with a strong focus on procedural fairness and educational outcomes."
UniSC continues to invest in professional development for staff, ensuring they are equipped to design robust assessments and engage constructively with new AI technologies.
The university is also consulting with students as partners, raising awareness of AI's capabilities, risks and ethical implications.
"Our vision is for UniSC graduates to be leaders in the ethical use of technology, not just in their studies but throughout their professional lives," Professor Wilmore said.
"By setting high standards now, we're building a culture of integrity and innovation for the future."