
EduExe blog


Assessed Workshops: AI-Integrated Learning

15 May 2025

3 minutes to read


As concerns about student use of AI in assessments grow, a knee-jerk reaction of increasing in-person exams may lead to unsustainable practices. In addition, we know that a single exam is not an inclusive or accessible way to assess knowledge. In this blog, we report on an alternative approach that uses timetabled slots to focus student learning through assessment, and ask whether assessed workshops might offer a solution to concerns about the use of AI in assessments.

Pedagogical Framework and Workshop Design

With Elkington’s (2020) framework in mind, I developed an innovative approach for the Database Technologies for Business Analytics module, combining feedback mechanisms and peer review for second-year BSc Business Analytics students. Three workshops were designed to assess the module’s Intended Learning Outcomes (ILOs). These workshops emulated realistic scenarios where students applied their knowledge. Unlike traditional exams, these workshops encouraged dialogue and collaboration, simulating a work environment.

Implementation

Workshop activities were communicated at the start of the term. Discussions held with students with Individual Learning Plans (ILPs) ensured they had the necessary support.

Students were guided on the topics for each of the workshops with differentiated reading lists and outlines of the concepts involved. For example, assessed workshop 1 was designed to assess knowledge of how to extract entities (things of importance in the scenario), how to generate, evaluate and interpret an entity relationship diagram (ERD), and how to apply knowledge of data integrity to a given SQL (database language) output.
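To give a concrete sense of that last task, the sketch below shows the kind of SQL output students might be asked to evaluate. It is a hypothetical illustration rather than an extract from the actual workshop scenarios: the table names, data and the specific referential-integrity error are invented for this post.

    -- Hypothetical example only: tables and data invented for illustration.
    CREATE TABLE customer (
        customer_id INT PRIMARY KEY,
        full_name   VARCHAR(100) NOT NULL
    );

    CREATE TABLE sales_order (
        order_id    INT PRIMARY KEY,
        customer_id INT NOT NULL,
        order_date  DATE NOT NULL,
        FOREIGN KEY (customer_id) REFERENCES customer (customer_id)
    );

    -- This row is accepted: customer 1 exists in the parent table.
    INSERT INTO customer (customer_id, full_name) VALUES (1, 'A. Customer');
    INSERT INTO sales_order (order_id, customer_id, order_date)
    VALUES (100, 1, '2025-01-13');

    -- This row breaches referential integrity: there is no customer 99,
    -- so the foreign key constraint rejects the insert.
    INSERT INTO sales_order (order_id, customer_id, order_date)
    VALUES (101, 99, '2025-01-14');

In a scenario of this kind, students would be expected to spot where the integrity rule is breached and explain why the database rejects the row.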

A practice workshop was run to familiarise students with the process and expectations, including constructive peer review. Direction was provided on what was expected of the peer review: that it be constructive, supportive and offer specific, actionable feedback.

Each student received a unique scenario and code via mail-merge on the Friday, with the workshop on the Monday, allowing one working day for review and reflection. Some students were unable to attend the workshop in person, so arrangements were made for them to join remotely in real time.

Workshop Execution

The ELE2 (Moodle) Workshop component facilitated the submission and peer review process. Figure 1 shows the different phases and setup.

Figure 1: The ELE process for setting up the peer activity (summary of key information).

Setup phase: set the workshop description; provide instructions for submission; edit the assessment form.

Submission phase: provide instructions for assessment; submit your work; allocate submissions.

Assessment phase: assess peers.

Grading evaluation phase: calculate submission grades; calculate assessment grades; provide a conclusion to the activity.

In addition, students were provided with a template for each of the workshops to guide them on the appropriate level of input. For example, students were to use GenAI tools to generate their ERD, enter the URL of the tool they used along with a copy of the ERD it generated, and then write a couple of paragraphs reflecting on the use of the tool. For identifying the errors, they were given a table to complete showing what the errors were and where they were found.

During the workshop, students had 50 minutes to prepare a submission for review based on the tasks and the template. A 10-minute break was followed by 50 minutes for the peer review. At the end of that second 50 minutes the workshop was closed, and students then had 48 hours to reflect on the feedback gained from the workshop and write that reflection up. They could adjust their answers based on the feedback should they wish.

Marking

Students were marked on the feedback they gave, rather than the feedback they received. Students were asked to review the marking criteria prior to each of the workshops and were given an opportunity to clarify what the marking criteria meant. The first workshop included discussion of the criteria, whereas the later workshops did not generate any further questions.

Outcomes

Students reported feeling less stressed and appreciated the incorporation of feedback. They valued that others’ peer review of their work did not affect their grades directly.

The use of AI tools was explicitly permitted, with students required to evaluate and reflect on the AI-generated outputs.

Conclusion

This approach to assessed workshops fostered a supportive learning environment, encouraging autonomous learning and practical application of knowledge. Future iterations will continue to refine this method, ensuring it remains effective and sustainable.

Elkington, S. (2020). Essential frameworks for enhancing student success: Transforming Assessment in Higher Education – A Guide (Student Success Global Framework Series). Advance HE. https://www.advance-he.ac.uk/knowledge-hub/essential-frameworks-enhancing-student-success-transforming-assessment


This post was written by Dr Shirley Atkinson, Faculty Director for Data Science and AI Education, Director of Education in Operations and Analytics, Senior Lecturer
