As research on the digital divide shifts away from questions of access and focuses instead on Digital Information Literacy (DIL) skills and the outcomes of productive use of those skills in digital contexts, we face significant measurement challenges. To meet these challenges, complex, interactive simulation-based assessments have been developed that capture authentic learner performances. In the current study, we describe a multi-step modeling method for identifying distinct strategies in the process data generated as students used a simulated web search tool during an inquiry task. The method considers the content, timing, and context of student actions. This approach identified meaningfully distinct strategies in students’ search processes, and these strategies were associated with differences in inquiry task performance.
Tenison, C., & Sparks, J. (2023). Combining cognitive theory and data driven approaches to examine students’ search behaviors in simulated digital environments. Large-scale Assessments in Education, 11, Article 28. (Article available through open access)
Tenison, C., & Sparks, J. (2022). Clustering student strategies in a simulated web search environment. Paper presented at the Annual Meeting of the American Educational Research Association. (Extended abstract available here)
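As a rough illustration of how per-student features reflecting the content, timing, and context of search actions might be grouped into candidate strategies, the sketch below clusters simulated feature vectors with k-means. The feature names, the synthetic data, and the choice of k-means are assumptions for demonstration only; they are not the modeling method used in the studies above.

```python
# Illustrative sketch only: grouping hypothetical search-process features
# into candidate strategy clusters. All feature names and the k-means
# choice are assumptions, not the cited studies' method.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical per-student features derived from search-process logs:
# number of queries issued, mean seconds between actions, and the
# proportion of clicks on relevant results (proxies for content,
# timing, and context of actions).
n_students = 200
features = np.column_stack([
    rng.poisson(4, n_students),          # query_count
    rng.gamma(2.0, 10.0, n_students),    # mean_action_gap_sec
    rng.beta(2, 2, n_students),          # relevant_click_ratio
])

# Standardize so each feature contributes comparably to distances.
scaled = StandardScaler().fit_transform(features)

# Partition students into a small number of candidate strategy groups.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scaled)
print("Cluster sizes:", np.bincount(kmeans.labels_))
```

In practice, cluster profiles like these would still need to be interpreted against theory and validated against outcome measures (e.g., inquiry task performance) before being treated as meaningful strategies.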