60-Second SoTL

Knowledge Surveys for Student Self-Assessment of Learning

Episode Summary

How can knowledge surveys strengthen students’ self-assessment and metacognition? This episode highlights a multi-semester study from the U.S. Air Force Academy on using knowledge surveys to help students better gauge their understanding and direct their learning.

Episode Notes

Access our full episode notes at https://www.centerforengagedlearning.org/knowledge-surveys-for-student-self-assessment-of-learning/.

How can knowledge surveys strengthen students’ self-assessment and metacognition? This episode highlights a multi-semester study from the U.S. Air Force Academy on using knowledge surveys to help students better gauge their understanding and direct their learning. Read the full study in this open-access article:

Sloan, Joel, Timothy Frank, Lauren Scharff, and Karin Becker. 2025. “Knowledge Surveys: An Effective and Robust Student Self-Assessment and Learning Tool.” Teaching & Learning Inquiry 13: 1–20. https://doi.org/10.20343/teachlearninqu.13.47

This episode was hosted, edited, and produced by Jessie L. Moore, Director of the Center for Engaged Learning and Professor of Professional Writing & Rhetoric.

60-Second SoTL is produced by the Center for Engaged Learning at Elon University.

Music: “Cryptic” by AudioCoffee.

Reflection image in episode art by Freepik.

Episode Transcription

(Music)

0:10

Jessie L. Moore:

How can knowledge surveys strengthen students’ self-assessment and metacognition? That’s the focus of this week’s 60-second SoTL from Elon University’s Center for Engaged Learning. I’m Jessie Moore. 

(Music)

0:25

In “Knowledge Surveys: An Effective and Robust Student Self-Assessment and Learning Tool,” Joel Sloan, Timothy Frank, Lauren Scharff, and Karin Becker share their experience implementing knowledge surveys across a variety of courses required in a civil engineering program at the United States Air Force Academy. Their article appears in Teaching & Learning Inquiry, a diamond open access journal.

0:48

Knowledge surveys are self-assessment question sets specific to the learning objectives of the course, and they prompt students to reflect on their understanding of course material and related skills. Knowledge surveys typically align with Bloom’s taxonomy, explicitly identifying the Bloom’s cognitive level associated with each self-assessment question. Because knowledge surveys focus on students’ confidence, these self-assessment tools can guide students’ decisions about which content they should spend more time reviewing and when they should seek additional help to support their learning.

As part of a larger research project, the U.S. Air Force Academy team embedded knowledge surveys in 18 sections of six courses in an engineering curriculum over nine semesters, reaching 219 students. Courses included a general education course and elective and required courses in the civil engineering degree program. Courses used different assessment types—exams, design projects, and technical writing—and were taught in multiple modes—in-person, online, and hybrid—since the research included semesters impacted by the COVID-19 pandemic.

1:55

Faculty created knowledge surveys by turning each learning objective into an “I can…” statement, tagged with a Bloom’s taxonomy level. Students completed either comprehensive pre-course knowledge surveys—in early semesters of the study—or pre-unit knowledge surveys at the beginning of each unit—the researchers’ preferred timing as the study continued because of its closer proximity to related learning activities. Students then completed post-unit knowledge surveys within 24 hours before each exam or major assignment. The resulting self-assessment scores could be compared directly to instructors’ parallel assessments.

In addition to comparing students’ knowledge survey self-assessments to exam scores, faculty conducted surveys at mid- and end-of-semester to understand student perspectives on how the knowledge surveys supported their learning. Open-ended responses were analyzed thematically by two independent coders, who reconciled categories and checked interrater reliability.

2:52

So what did they find? Across courses, most knowledge survey scores were within +/- 10% of instructor grades, with error distributions centered near zero—suggesting that students’ post-unit self-assessments were generally accurate. Correlations between knowledge survey scores and exam scores were positive and statistically significant, and they increased over the span of the course, suggesting that students refined their self-assessment skills over time as they received feedback.

Student perceptions of the learning tool also were largely positive: They said knowledge surveys clarified expectations, served as a focused study guide, and helped them decide what to review. Many described using knowledge surveys in exactly the metacognitive ways the researchers hoped they would—checking their understanding and then adjusting their study strategies.

For instructors, knowledge surveys were relatively easy to implement, helped check alignment among objectives, activities, and assessments, and supported richer, data-informed conversations with students about their learning.

3:52

So what are the takeaways? Knowledge surveys are robust, flexible, and practical tools to help students practice self-assessment and build lifelong metacognitive habits.

To learn more about this study, visit our show notes for a link to the open-access article, which also offers a rich example of a multi-semester, collaborative SoTL study.

4:11

(Music)

4:15

Jessie Moore:

Join us for our next episode of 60-second SoTL from Elon University’s Center for Engaged Learning for another snapshot of recent scholarship of teaching and learning. Learn more about the Center at www.CenterForEngagedLearning.org.

(Music)