2024-05-20 Educators WG: AI-Powered Assessment
Recording:
Video: 5 Lessons from AI-recording.mp4
Transcript: GMT20240520-165752_Recording.transcript.vtt
Assets:
Open Source Templates
Assistant Template - For phased interactions with an AI that involve user input, AI feedback, and optional AI scoring. Can be used to build things like case study reviews, writing feedback, AI debates, etc.
Completion Template - For one-off interactions with an AI, such as generating MCQs (a rough sketch contrasting the two template styles follows the asset list).
Demo Apps
Guided Case Study: Students review a case study, critically reflect on it, and receive AI-generated feedback guided by what the faculty thinks is important about the case study.
Writing Feedback: A writing exercise where a student drafts an introduction paragraph for a grant application. Again, the student receives AI-generated feedback guided by what the faculty thinks is important to include in the writing, and then has a chance to revise the original draft.
AI Debate: A student engages in a debate with the AI (in this case, on the topic of Digital Health in Medicine). The AI is guided to debate the student for two rounds and then summarize lessons learned and good points made by the student.
MCQ Wizard: Generates MCQs with optional feedback and hinting based on faculty requirements.
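As a rough illustration of the difference between the two template styles (this is a sketch, not the actual template code; it assumes the OpenAI Python SDK, and the model name, prompts, and function names are placeholders):

```python
# Rough sketch only: the actual template code is not shown in the recording.
# Assumes the OpenAI Python SDK; model name, prompts, and function names are placeholders.
from openai import OpenAI

client = OpenAI()

# Completion-style use: a single, one-off request, e.g. generating MCQs.
def generate_mcqs(topic: str, n: int = 3) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "You write multiple-choice questions for faculty."},
            {"role": "user",
             "content": f"Write {n} MCQs about {topic}, each with feedback and a hint."},
        ],
    )
    return resp.choices[0].message.content

# Assistant-style use: a phased exchange where faculty guidance steers the
# feedback and the conversation history is carried between turns.
def guided_feedback(faculty_guidance: str, draft: str, revision: str) -> tuple[str, str]:
    history = [
        {"role": "system",
         "content": f"Give writing feedback guided by these faculty priorities: {faculty_guidance}"},
        {"role": "user", "content": draft},
    ]
    first = client.chat.completions.create(model="gpt-4o", messages=history)
    first_feedback = first.choices[0].message.content
    # Carry the whole exchange forward so the second round sees the draft,
    # the feedback, and the revision together.
    history += [
        {"role": "assistant", "content": first_feedback},
        {"role": "user", "content": f"Here is my revised draft:\n{revision}"},
    ]
    second = client.chat.completions.create(model="gpt-4o", messages=history)
    return first_feedback, second.choices[0].message.content
```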
Minutes:
00:15 - 02:07 Welcome and Introduction
John Swope, Education Technology Specialist, Chair of the St. George's University AI in Higher Education Committee, and author of Micro AI Apps in Online Education: Impacts on Efficiency, Quality and Future Directions, presented 5 Lessons Learned Building AI Assessments.
02:07 - 40:35 Presentation
AI-Powered Assessment
Sample video case study
5 Lessons:
AI costs for each AI model version
AI strengths and weaknesses
AI Scoring
Prompt sequencing
AI's rapid evolution
Other Thoughts
AI Assessments should be faculty guided
AI Assessments should be clearly disclosed
AI Assessments should be elective
Students' AI literacy matters for when the AI gets confused
40:36 - 56:41 Q&A
Do you have any sites for demos?
Yes (@John Swope will share).
If people are developing a curriculum with copyright ownership or privacy concerns, how might they go about using AI?
My personal opinion is that AI is too big to fail. There is no legislation that's going to come in and kill the momentum of AI at this point; it's too big. The worst case is that they have to pay some money.
Everyone who uses AI is responsible for the outputs. Anyone who takes AI's output verbatim is going to run into problems, and our students should learn that as well.
What do you think of students using some sort of AI, such as Copilot, to get assessment feedback for themselves? What are the pros and cons of a rubric not built by the instructor?
Pros - Students' AI literacy is an important skill going forward.
Cons - Whether students are getting the right data from the AI, and whether they are using the AI itself in the right manner.
Have you compared against any published literature comparing results from the kinds of approaches you shared versus a multi-agent workflow?
No. There is nothing yet that aligns with what I want to work with, and hopefully I can get there.
Do you have lessons learned in video-based assessments?
If you are talking about a scenario where you record some sort of interaction, no. That is part of the promise of GPT-4o (Omni), which can actually understand video, so that is theoretically something we are getting closer to.
Cost comparison between using the GPT Assistants API versus the Completions API.
The Assistants API has the potential for cost overruns and to get out of hand quicker because you have a conversation that is happening, and the AI needs to keep track of the context of that conversation. You can feed data into the Assistants API. The Completions API is a lot simpler, and the costs are a little bit easier to understand and manage.
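As a back-of-the-envelope sketch of why conversation-style usage can get out of hand faster (the per-token rate below is hypothetical, not actual pricing, and it assumes each turn resends the full prior context as input):

```python
# Back-of-the-envelope illustration of the cost point above; the per-token
# rate is hypothetical, not actual OpenAI pricing. Assumes each turn of an
# assistant-style conversation resends the full prior context as input.

RATE_PER_1K_INPUT_TOKENS = 0.005  # hypothetical USD rate

def one_off_cost(prompt_tokens: int) -> float:
    """A single completion pays for its prompt once."""
    return prompt_tokens / 1000 * RATE_PER_1K_INPUT_TOKENS

def conversation_cost(turns: int, tokens_per_turn: int) -> float:
    """Each turn resends everything said so far, so total input tokens grow
    roughly quadratically with the number of turns."""
    total_input_tokens = sum(t * tokens_per_turn for t in range(1, turns + 1))
    return total_input_tokens / 1000 * RATE_PER_1K_INPUT_TOKENS

print(f"One-off request:      ${one_off_cost(500):.4f}")        # prompt paid once
print(f"10-turn conversation: ${conversation_cost(10, 500):.4f}")  # context accumulates
```

Under these assumptions, a 10-turn exchange costs roughly 50x a single one-off request of the same size, which is the sense in which conversation-based costs are harder to predict and manage.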