Study Finds Unique Strategies to Improve Student Performance on Physics Exams
UCLA and UC San Diego researchers identify promising interventions, including AI-generated hints, to address inequities in academic performance
A team of researchers from the University of California, Los Angeles and the University of California San Diego has identified promising uses of AI and extra credit assignments to improve student performance in introductory physics courses, including reducing the impact of inequities. The study was published June 12, 2025, in Physical Review Physics Education Research.
This study is one of the first to explore the impact on exam performance of offering large language model (LLM)-based AI-generated hints during homework assignments for an undergraduate physics course.
The study also addressed inequities in student access to foundational mathematics, such as trigonometry and calculus. Black, Hispanic/Latine, and Native American students, as well as students from low-income backgrounds, are less likely to have had the opportunity to take advanced math courses in high school, putting them at a disadvantage in college physics.
To mitigate these disparities, the researchers implemented a two-pronged intervention:
- Incentivized Supplemental Math Assignments: Extra credit was scaled to provide greater value to students with lower exam scores, encouraging participation and reinforcing foundational skills.
- AI-Generated Learning Support Tools: Optional hints embedded within physics homework assignments offered personalized assistance, reducing barriers to help-seeking and promoting independent learning.
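The article does not specify the exact formula used to scale the extra credit, but the core idea, awarding a larger bonus to students with lower exam scores, can be sketched as follows. The function name, the maximum bonus of 5 points, and the linear scaling are illustrative assumptions, not details from the study.

```python
def scaled_extra_credit(exam_score, max_score=100, max_bonus=5.0):
    """Hypothetical scaling: students who scored lower on the exam earn a
    larger bonus for completing the supplemental math assignment.

    The linear scheme and the 5-point cap are assumptions for illustration;
    the study's actual scaling rule may differ.
    """
    # Fraction of points the student missed on the exam (0.0 to 1.0).
    deficit = max(0.0, max_score - exam_score) / max_score
    # Bonus grows linearly with the deficit, up to max_bonus.
    return round(max_bonus * deficit, 2)
```

Under this sketch, a student who scored 50/100 would earn a 2.5-point bonus for completing the assignment, while a student who scored 100/100 would earn none, concentrating the incentive on students who stand to benefit most from reinforcing foundational skills.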
Both interventions were grounded in the situated expectancy-value theory of achievement motivation, which posits that students are more likely to engage in tasks they expect to succeed at and perceive as valuable.
Most students tried the AI-generated hints. When hints were available, 88% of students used them on at least one homework question (among problems matched to exam questions), and 57% used them on at least five questions. On average, students used hints on at least 20% of the questions.
The study’s findings indicate that these interventions are associated with increased exam scores and reduced disparities in completion rates of supplemental math assignments. Notably, the scaled extra credit system was particularly effective for students who might benefit most from additional support.
Yifan Lu, the first author and a doctoral student in the Department of Physics and Astronomy at UCLA, said:
“By implementing scalable and low-cost interventions, we can create more equitable learning environments. Using AI tools can remove social barriers that prevent students from asking instructors, teaching assistants, and/or peers for help.”
Elizabeth H. Simmons, study co-author and Executive Vice Chancellor and Distinguished Professor in the Department of Physics at UC San Diego, added:
“Our findings demonstrate that thoughtful integration of technology and targeted incentives can make a meaningful difference in student academic performance. These strategies are effective and feasible for widespread adoption in a wide range of undergraduate courses that include mathematical topics.”
Based on the findings of this study, several introductory physics courses taught at UCLA have already incorporated supplemental math assignments and AI-generated hints, with plans for wider implementation underway.
“In the next year, we are working with faculty members in the Chemistry and Life Science core departments who will adapt the supplemental math to their context and implement our approach in their courses,” said K. Supriya, Associate Director of Assessment at the UCLA Center for Education Innovation and Learning in the Sciences.
The study’s findings underscore an effective use of AI in supporting improved learning outcomes. For future studies, the research team recommended exploring the factors that shape students’ decisions to use or not use the optional supports offered.
This collaborative effort highlights the commitment of both institutions to advancing effective instructional practices and finding innovative ways to foster inclusive learning environments across all disciplines, including STEM.
Full list of authors: Yifan Lu, UCLA; K. Supriya, UCLA; Shanna Shaked, UCLA; Elizabeth H. Simmons, UC San Diego; and Alexander Kusenko, UCLA.
The online learning platform Kudu, developed by Warren Essey and Alexander Kusenko of UCLA, powered the AI tools used in this study.
Funding for this research was provided by the UCLA Teaching and Learning Center and by the UCLA Department of Physics and Astronomy.