KEYNOTE SPEAKERS

Prof. Christopher DeLuca

Queen’s University

Title: From Artificial Intelligence to Assessment Innovation: Toward a More Human Future for Assessment Research and Practice

Abstract: This keynote address considers some of the most pressing challenges before us (the rapid onset and evolution of artificial intelligence, the global sustainability crisis, and the paramount need to restore health and wellbeing in our societies) as impetus for assessment innovation. Drawing on studies of teachers’ innovative assessment practices, this talk envisions a brighter, more human future for assessment in schools and classrooms.

Bio: Dr. Christopher DeLuca is an Associate Dean at the School of Graduate Studies and Postdoctoral Affairs and a Professor of Educational Assessment in the Faculty of Education, Queen’s University. Dr. DeLuca leads the Classroom Assessment Research Team and is Director of the Queen’s Assessment and Evaluation Group.

Dr. Yiasemina Karagiorgi

Cyprus Ministry of Education, Sports and Youth

Title: The Implementation of International and Local Assessment Programmes in Cyprus: Reflecting on the Experiences Behind and the Challenges Ahead

Abstract: Traditional student testing at the national level has been challenged by the evolution of technology and its impact on educational assessment. This presentation reflects on the development of the international and national large-scale assessments implemented in Cyprus over the last 10 years by the Centre of Educational Research and Evaluation (CERE) of the Cyprus Pedagogical Institute. These assessment programmes differ in their intended outcomes and testing processes. They have also responded to worldwide advancements in assessment to different extents and in different ways: while some programmes still reflect what is known as ‘traditional testing’, others have moved on to integrate, for example, computerised adaptive testing and process data. The presentation aims to highlight lessons learned from past experiences and to chart the way forward.

Bio: Dr Yiasemina Karagiorgi [karagiorgi.y@cyearn.pi.ac.cy] has been the Head of the Centre of Educational Research and Evaluation (CERE) of the Cyprus Ministry of Education, Sports and Youth since September 2011. As Head of the CERE, she currently supervises the implementation of national research programmes (e.g. the programme for functional literacy) and evaluation studies of programmes and innovations of the Ministry of Education. She is also the national project manager for Cyprus for international surveys such as TALIS, PISA, TIMSS, ICCS, ICILS, PIRLS, and HBSC. She has participated in several funded European projects and has published her work in international and local educational journals.

Heather Kayton

University of Oxford

Winner of the Kathleen Tattersall New Researcher Award

Title: Evaluating the validity and comparability of PIRLS 2016 in South Africa

Abstract: Large-scale assessments of reading, such as the Progress in International Reading Literacy Study (PIRLS), are useful for understanding reading development. PIRLS aims to provide valid, reliable, and fair assessments of reading across multiple countries. However, the extreme range in reading achievement outcomes, from the very high performance of some countries to the very low performance of others, creates substantial disparities that threaten the validity, reliability, and fairness of PIRLS results. South Africa participates in PIRLS to better understand and inform interventions that can help improve reading in the country. But consistently low performance in PIRLS, coupled with intricate multilingualism, historical racial discrimination, and extreme inequality, presents further challenges to the already complicated task of assessing reading.

Bio: Heather Leigh Kayton is a Senior Education Specialist at the What Works Hub for Global Education at the Blavatnik School of Government. Her work focuses on improving educational outcomes in developing countries, particularly the relationships between diverse, multilingual classrooms and foundational literacy development.

Heather has over a decade of experience as a teacher and teacher trainer in South African schools, which inspired her passion for finding pragmatic, equitable solutions to ensure all students learn to read successfully. Heather holds an MA in Applied Linguistics from the University of Johannesburg and a DPhil in Education from the University of Oxford.

Prof. Joshua McGrane

University of Melbourne

Title: Educational Assessment and Generative AI: Less Talk, More Evidence

Abstract: The rapid emergence of AI, particularly through large language models (LLMs) and other generative AI tools, marks a profound disruption to education — arguably the most significant in decades. Initial responses from educational institutions often involved bans on AI for both teachers and students, at all levels of education, with fears about cheating, misinformation, and bias driving much of the conversation. While crucial, this ethical discourse has overshadowed an equally urgent question: How can we use these tools to enhance educational assessment? With bans being lifted, many institutions now offer little guidance beyond vague directives to “use responsibly” and “fact-check.” In this keynote, I will present two years of collaborative research that directly tackles this challenge, emphasising evidence-based strategies for integrating AI into assessment practices. The talk will explore key areas where AI can potentially add value to educational assessment while maintaining ethical rigour, and will propose critical research directions for the future. The aim is not just to debate AI’s place in educational assessment, but to provide practical, actionable insights into how we can leverage these tools to improve assessment outcomes for learners and educators alike. It’s time to shift from discussion to evidence-based implementation.

Bio: Joshua McGrane is Associate Professor of Measurement Analytics and Deputy Director of the Assessment and Evaluation Research Centre at the University of Melbourne. He is an Executive Editor of the journal Assessment in Education: Principles, Policy and Practice. He has been an expert advisor to AQA, Qualifications Wales, and the International Baccalaureate. His research cuts across disciplines and paradigms, with a particular focus on the philosophy of measurement, Rasch modelling, and the use of comparative judgement in educational assessment. Most recently, he has been working with colleagues to evaluate the ethical, conceptual, and empirical implications of the new wave of AI for educational assessment and measurement.