Research

The picture was generated with DALL·E.

My research primarily focuses on four areas: modeling student and response behaviors in assessments and learning environments, automated item quality evaluation, automated scoring and feedback, and ethical considerations of artificial intelligence in education. Across these areas, I use the following methods:

  • Educational data mining
  • Machine learning
  • Natural language processing
  • Learning analytics
  • Psychometrics and quantitative methods

Ongoing Projects

Evaluating the quality of automatically generated items using large language models

PI: Guher Gorgun, University of Alberta (Dissertation)

The purpose of this project is to develop an automated evaluation tool for automatically generated items by leveraging large language models. Our goal is to make item quality evaluation more scalable and efficient.
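As a rough illustration only, the snippet below sketches how a large language model can be prompted to rate a generated item against a small rubric. The rubric dimensions, model name, and JSON output format are placeholder assumptions, not the evaluation tool developed in this project.

    import json
    from openai import OpenAI  # assumes the openai>=1.0 client and an API key in the environment

    client = OpenAI()

    # Hypothetical rubric dimensions; the project's actual criteria may differ.
    RUBRIC = ["clarity", "alignment with the target skill", "plausibility of distractors"]

    def rate_item(stem: str, options: list[str]) -> dict:
        """Ask the model for a 1-5 score on each rubric dimension (illustrative only)."""
        prompt = (
            "Rate the following multiple-choice item on a 1-5 scale for each criterion: "
            + ", ".join(RUBRIC)
            + ". Respond with a JSON object mapping each criterion to a score.\n\n"
            + f"Stem: {stem}\nOptions: {options}"
        )
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            response_format={"type": "json_object"},
            messages=[{"role": "user", "content": prompt}],
        )
        return json.loads(response.choices[0].message.content)

    # Hypothetical generated item:
    # rate_item("What is 3/4 written as a decimal?", ["0.25", "0.34", "0.75", "1.33"])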

Developing a human-centred automated scoring and feedback system for higher education instructors

PI: Guher Gorgun, University of Alberta

Funded: Alberta Innovates Graduate Student Scholarship

The purpose of this project is to develop an automated scoring and feedback system with student and instructor input. We conduct qualitative interviews with instructors and students to develop a feasible and efficient scoring and feedback system to be used by higher education instructors.

Identifying the predictors of mathematics anxiety and performance in Canada: An Educational Data Mining Approach

PI: Dr. Okan Bulut, University of Alberta

We aim to mine international large-scale assessment data (i.e., PISA and TIMSS) to identify the predictors of math anxiety and math performance, and to develop actionable insights and recommendations for tackling math anxiety and boosting performance among students in the province of Alberta, Canada.
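The snippet below is a minimal sketch of this kind of data-mining workflow, assuming a flat PISA/TIMSS-style student file. The file name and predictor columns are hypothetical placeholders, not the project's actual variables.

    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    students = pd.read_csv("pisa_alberta_subset.csv")            # hypothetical extract
    predictors = ["ses_index", "teacher_support", "self_efficacy",
                  "weekly_study_hours", "parental_involvement"]  # hypothetical columns

    X_train, X_test, y_train, y_test = train_test_split(
        students[predictors], students["math_anxiety"], random_state=42)

    model = RandomForestRegressor(n_estimators=300, random_state=42)
    model.fit(X_train, y_train)

    # Rank predictors by importance to surface actionable variables.
    importance = pd.Series(model.feature_importances_, index=predictors)
    print(importance.sort_values(ascending=False))
    print("Test R^2:", model.score(X_test, y_test))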

A systematic literature review of ethical issues of artificial intelligence in education

The purpose of this project is to conduct a systematic literature review that identifies ethical issues and concerns about artificial intelligence (AI) in education, and to develop an AI ethics framework for researchers who use AI methods in education.

Standardization of Alberta Education Reading Assessment

PI: Dr. George K. Georgiou, University of Alberta

The purpose of this project is to develop reading norms for Grade 1, Grade 2, and Grade 3 students in English, Francophone, and French Immersion schools in Alberta, Canada.

Identifying common math misconceptions in ASSISTments

Advisor & Collaborator: Dr. Anthony Botelho, University of Florida

The aim of this study is to use machine learning and natural language processing approaches (e.g., SBERT) to identify common math misconceptions in an intelligent tutoring system.
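As an illustration, the snippet below sketches one way SBERT embeddings can be clustered so that recurring misconceptions surface as groups of similar wrong answers. The checkpoint name and example responses are hypothetical, not the study's actual data or pipeline.

    from sentence_transformers import SentenceTransformer
    from sklearn.cluster import KMeans

    wrong_answers = [
        "1/4 + 1/3 = 2/7 because you add tops and bottoms",
        "I added the numerators and the denominators",
        "0.5 x 10 = 0.50, multiplying by ten adds a zero",
    ]  # hypothetical open-ended responses

    model = SentenceTransformer("all-MiniLM-L6-v2")   # a common SBERT checkpoint
    embeddings = model.encode(wrong_answers, normalize_embeddings=True)

    labels = KMeans(n_clusters=2, n_init="auto", random_state=0).fit_predict(embeddings)
    for answer, label in zip(wrong_answers, labels):
        print(label, answer)  # responses sharing a label suggest a shared misconception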

Completed Projects

Identifying aberrant response behaviors using clickstream data among students with significant cognitive disabilities (2022/01 – 2022/04)

PI: Guher Gorgun, University of Alberta

Funded: ATLAS Doctoral Research Fellowship

This research project aimed to explore aberrant responses in an alternate assessment administered to students with significant cognitive disabilities by analyzing clickstream and process data with the longest common subsequence method. The unique testing setting allowed us to employ innovative approaches for identifying aberrant response behaviors in this population.
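The snippet below sketches the core idea: the longest common subsequence (LCS) between a student's click sequence and a reference action sequence yields a simple similarity score, and unusually low values can flag potentially aberrant behavior. The action codes are hypothetical, not the assessment's actual event labels.

    def lcs_length(a: list[str], b: list[str]) -> int:
        """Classic dynamic-programming LCS length."""
        dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
        for i, x in enumerate(a, 1):
            for j, y in enumerate(b, 1):
                dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
        return dp[len(a)][len(b)]

    reference = ["open_item", "play_audio", "select_option", "confirm", "next_item"]
    student   = ["open_item", "next_item"]  # rapid skip with little engagement

    similarity = lcs_length(reference, student) / len(reference)
    print(f"LCS similarity to reference: {similarity:.2f}")  # low values may indicate aberrance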

Developing the Students’ Perceptions of Teaching (SPOT) survey (2022/07 – 2022/09)

PIs: Drs. Okan Bulut, University of Alberta & Lia Daniels, University of Alberta

The aim of this project was to develop and validate a course evaluation survey. We conducted semi-structured interviews with instructors and students and used psychometric approaches to collect validity evidence for the survey’s use and interpretation.

Predicting cognitive engagement in online course discussion forums (2021/09 – 2022/02)

PI: Guher Gorgun, University of Alberta

Advisor: Carrie Demmans Epp, University of Alberta

We labelled online discussion forum posts according to a levels-of-cognitive-engagement scheme we created, and then developed an automated detection system using machine learning and natural language processing approaches.
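As a rough illustration of the detection step, the snippet below sketches a simple TF-IDF plus logistic regression baseline for labelling posts by engagement level. The example posts and labels are hypothetical, and this baseline stands in for, rather than reproduces, the system we built.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    posts = [
        "I agree with the previous post.",
        "Comparing the two theories, the second explains the data better because ...",
        "Thanks for sharing!",
        "If we apply the formula to the new case, the prediction changes because ...",
    ]
    labels = ["low", "high", "low", "high"]  # hypothetical engagement levels

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
    clf.fit(posts, labels)

    print(clf.predict(["Building on your argument, the evidence suggests ..."]))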

We presented our study at the annual meeting of the Educational Data Mining (EDM) conference in 2022 in Durham, UK, and our paper was published in the conference proceedings.

The role of formative assessments for learning analytics (2021/04 – 2021/06)

PI: Dr. Okan Bulut, University of Alberta

This project analyzed the role of formative assessments in predicting students’ course performance using large-scale data from a university course.
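The snippet below is a minimal sketch of this prediction framing, regressing final course performance on formative assessment scores. The file and column names are hypothetical placeholders, not the course data used in the project.

    import pandas as pd
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    course = pd.read_csv("course_records.csv")                   # hypothetical export
    formative = ["quiz_1", "quiz_2", "quiz_3", "practice_completion"]

    model = LinearRegression()
    r2 = cross_val_score(model, course[formative], course["final_grade"], cv=5, scoring="r2")
    print("Cross-validated R^2 of formative scores predicting the final grade:", r2.mean())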

We developed an empirical research paper using the findings of this project. The findings were also presented at the International Conference on Learning Analytics and Knowledge (LAK2022).

Measuring students’ socio-emotional well-being to optimize learning: Looking beyond academic performance and grades (2019/09 – 2020/12)

PIs: Drs. Okan Bulut, University of Alberta & Man-Wai Chu, University of Calgary

This research project developed a socio-emotional well-being scale. By gathering empirical evidence on the scale’s psychometric properties, we collected validity evidence for its use and interpretation.

We created a technical report and an empirical research paper based on the findings of this study. The study results were also presented at national and international conferences including the annual meeting of the Canadian Society for the Study of Education.

Assessment literacy module development (2021/09 – 2021/12)

PI: Dr. Okan Bulut, University of Alberta

The purpose of this project was to develop six modules on assessment and feedback development for higher education instructors.

Collapsing scale categories (2016/09 – 2019/05)

PI: Dr. Kimberly F. Colvin, University at Albany, SUNY

This project analyzed and compared the psychometric properties of a scale administered with different numbers of rating scale categories, as well as the influence of collapsing scale categories on those properties.
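The snippet below sketches what collapsing categories means in practice: simulated 5-point responses are merged into 3 categories and Cronbach’s alpha is compared before and after. The data are simulated purely for illustration and are not the project’s results.

    import numpy as np

    def cronbach_alpha(responses: np.ndarray) -> float:
        """responses: persons x items matrix of scored responses."""
        k = responses.shape[1]
        item_vars = responses.var(axis=0, ddof=1).sum()
        total_var = responses.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(500, 1))                                 # simulated trait
    five_point = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(500, 8))), 1, 5)

    # Collapse 1-2 -> 1, 3 -> 2, 4-5 -> 3.
    three_point = np.digitize(five_point, bins=[2.5, 3.5]) + 1

    print("alpha, 5 categories:", round(cronbach_alpha(five_point), 3))
    print("alpha, 3 categories:", round(cronbach_alpha(three_point), 3))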

We developed two empirical research papers based on this project: the first examined the psychometric properties (e.g., validity) of scales with different numbers of rating scale categories, and the second examined how scale categories are interpreted when they are collapsed. The findings were also presented at the annual meeting of the National Council on Measurement in Education (NCME).

Instructors’ and students’ reactions to high-stakes testing (2018/05 – 2018/09)

PIs: Drs. Julie Learned, University at Albany, SUNY & Kathryn S. Schiller, University at Albany, SUNY

The purpose of this research was to analyze adolescents’ and educators’ perceptions, reactions, and resistance to high-stakes testing.

We developed an empirical research paper based on this study. The findings were also presented at the annual meeting of the American Educational Research Association (AERA).