Actionable Ideas for working with Artificial Intelligence
Artificial Intelligence (AI) is the talk of the sector, and suggestions about how it could be used in learning, teaching and assessment abound. But how to take the leap and start translating ideas into action? We asked the Artificial Intelligence Community of Practice how they’ve been working with AI this semester.
AI ideas in action
Dr Kathryn Bowd in Media asks her students to critique a piece of journalism written using AI, and then asks them to use the same sources to produce their own version of the story. This gives them an opportunity to consider the similarities and differences between what the AI produced and the way they chose to frame the story.
In the School of Education, Dr Daniel Lee’s students prepare for their Academic Literacy class by asking ChatPDF to generate discussion questions based on the course readings.
Dr Cheryl Pope’s Computer Science students compare their data structure designs to those generated by ChatGPT and evaluate the accuracy and precision of the AI’s answers. They reflect on the best design as a way of evaluating approaches and testing the capabilities and limitations of ChatGPT.
In Architecture, Dr Amit Srivastava is excited by the way that AI image generators can enable students to experiment with a vast range of design effects in a short space of time, allowing students to pursue their creativity in new ways. An iterative assessment asks students to justify their design choices at each step of the drafting process.
Dr Eleanor Parker and Dr Dandara Haag from Adelaide Dental School tested their fifth-year students’ accumulated knowledge by asking them to critique ChatGPT’s response to a complex scenario relating to population health. Students drew on their learning across the previous four years to critique the AI-generated response using critical thinking, analytical skills and the most up-to-date scientific evidence. This course also uses Cadmus to track student engagement with the course materials and assessments.
What do students think?
Here are some examples of students’ reactions to the opportunity to critique ChatGPT in their dental assessment (shared with permission).
“When I look at AI and machine learning today, I imagine that this amazement is what my parents felt when electronic mail became widespread. However, technology has much catching up to do if it seeks to grapple with complex sociological issues such as this.”
“Overall, the ChatGPT response failed to provide an in-depth analysis of the factors .... The lack of depth and breadth in its arguments highlighted underlying flaws within artificial intelligence models and their limited capacity to provide detailed answers on epidemiological topics by referencing the latest data. Thus, students must do their due diligence when using such platforms for research.”
“The AI-generated response lacks references and information to back up the statement, creating gaps in the reliability and validity of the information provided. There are no references to back up claims to the statement ..... The written response (ChatGPT) was easy to understand, although there are certain aspects that require clarification. Overall, the response requires scientific evidence to back up claims, and the AI technology progression holds potential for better and well-supported responses, incorporating cultural safety in the future.”
“I personally felt that this was a really interesting assignment and it brought out a competitive spirit within me in trying to best an AI in knowledge and quality of content. It did take me longer than expected, but I realised it helped me evaluate what I was reading and also actually understand the issues better. During that time, I had also just started playing around with ChatGPT and asked it to condense dental knowledge and was totally amazed at the speed it could do everything, but was not being critical of it AT ALL and just took everything at face value, so having finished the essay, it gave me a new angle to look at AI. As a student, just want to say that this was a really well set assignment!”
Keeping academic integrity in mind
Over in the Business School, Learning Designer and teacher John Murphy is thinking about academic integrity. For the first few weeks of semester, John’s students use the last 10-15 minutes of tutorials to send him an email about something that they found interesting or challenging, something they are curious about, or a current real-world example that relates to that week’s topic. These are not graded, but John addresses them in the next tutorial, and uses them as a benchmark if any academic integrity concerns arise in assignments. They also help him get to know his students better.
Dr Tsan-Huang Tsai has also considered academic integrity in the design of problem-based questions in his Music course. Tsan uses a timed assignment in Cadmus and asks students to respond to a range of music clips which relate directly to the course materials.
Dr Amy Milka, Academic Integrity Manager, is (unsurprisingly!) also thinking about academic integrity and artificial intelligence going into semester 2. Amy advises using the University’s Student AI Guide to start a conversation about academic integrity and expectations for AI use in your course, and to discuss the benefits and drawbacks of using AI in your discipline. Embed a link to the Library’s Guide to Citing and Referencing AI or a video about academic integrity and artificial intelligence. You could even collaborate with students on some ‘rules of engagement’ with AI in your assessments. If you think you’ve found an academic integrity issue, the Academic Integrity team also has some advice on how to gather evidence.
Consider access and equity
If you are asking students to use AI in your course, ensure that all students are able to access a free tool. You should also consider the Generative AI Security Guidelines.
Looking for more advice about AI? Head to the AI and Learning webpage for more resources and articles.