Journal of Modern Foreign Psychology
2024. Vol. 13, no. 1, 58–68
doi:10.17759/jmfp.2024130105
ISSN: 2304-4977 (online)
Exploring the Relationship between Performance and Response Process Data in Digital Literacy Assessment
Abstract
Measuring complex latent constructs is challenging because of their multi-dimensionality. In this context, computer-based assessments have gained popularity due to their ability to handle large volumes of diverse data. The aim of this study is to investigate the interrelationship between performance, time, and actions in a computer-based digital literacy assessment. The study involved more than 400 8th-grade schoolchildren (approximately 14–15 years old) from secondary schools in Russia. A subset of indicators was used that captures the analysis of data, information, and digital content, a component of information literacy within the digital literacy framework. The data were used to build latent models within the structural equation modeling framework. The confirmatory one-factor model for the Performance factor showed a good fit to the data (CFI=1; TLI=1; RMSEA=0). The model with dependencies among indicators demonstrated improved model fit (χ²(18)=510.65; p=0.05) compared to the model without such dependencies. The results suggest that performance, time, and actions are interdependent. The findings underscore the need for a comprehensive approach to assessing digital literacy that accounts for these interdependencies, as well as for investigating behavioral patterns of interaction with large amounts of information in the digital environment.
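To make the modeling pipeline concrete, the following minimal sketch fits the one-factor Performance model and reports the fit indices cited above (CFI, TLI, RMSEA). The article does not state which SEM software was used; the Python semopy package, the CSV file name, and the default estimator here are illustrative assumptions, not the authors' actual setup.

```python
# Minimal sketch: one-factor CFA for the Performance indicators.
# Assumptions (not from the article): the semopy package, a CSV file
# named "digital_literacy.csv" with one column per indicator, and
# semopy's default ML estimation.
import pandas as pd
import semopy

# lavaan-style syntax: '=~' defines the latent Performance factor.
ONE_FACTOR = """
Performance =~ t03_m01_p + t03_m02_p + t03_m05_p + t03_m07_p + t07_m06_p + t07_m02_p
"""

data = pd.read_csv("digital_literacy.csv")  # hypothetical file name

model = semopy.Model(ONE_FACTOR)
model.fit(data)

# calc_stats returns a one-row table with chi2, CFI, TLI, RMSEA, etc.
print(semopy.calc_stats(model)[["chi2", "CFI", "TLI", "RMSEA"]])
```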
General Information
Keywords: computer-based assessment, digital literacy, evidence-centered design, structural equation modeling, confirmatory factor analysis, process data, response time, clicks
Journal rubric: Educational Psychology and Pedagogical Psychology
Article type: scientific article
DOI: https://doi.org/10.17759/jmfp.2024130105
Funding. This work is supported by the Ministry of Science and Higher Education of the Russian Federation (Agreement No. 075-10-2021-093; Project COG-RND-2104).
Received: 31.03.2024
Accepted:
For citation: Tkachenko I.O., Tarasova K.V., Gracheva D.A. Exploring the Relationship between Performance and Response Process Data in Digital Literacy Assessment [Electronic resource]. Sovremennaia zarubezhnaia psikhologiia = Journal of Modern Foreign Psychology, 2024. Vol. 13, no. 1, pp. 58–68. DOI: 10.17759/jmfp.2024130105.
Full text
Introduction
Methods
Participants
Instrument
Data Analysis Strategy
| Indicator | Standardized factor loading |
| --- | --- |
| the Performance factor | |
| t03_m01_p | 0.207* |
| t03_m02_p | 0.265* |
| t03_m05_p | 0.298* |
| t03_m07_p | 0.304* |
| t07_m06_p | 0.419* |
| t07_m02_p | 0.189* |
| the Time factor | |
| t03_m01_t | 0.63* |
| t03_m02_t | 0.653* |
| t03_m05_t | 0.743* |
| t03_m07_t | 0.702* |
| t07_m06_t | 0.607* |
| t07_m02_t | 0.288* |
| the Action factor | |
| t03_m01_a | 0.373* |
| t03_m02_a | 0.343* |
| t03_m05_a | 0.205* |
| t03_m07_a | 0.248* |
| t07_m06_a | 0.41* |
| t07_m02_a | 0.374* |
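The loadings above correspond to a measurement model with three latent factors (Performance, Time, Action), one per indicator type. A sketch of how such standardized loadings could be reproduced, under the same illustrative assumptions (semopy, hypothetical file name) as in the earlier sketch:

```python
# Sketch of the three-factor measurement model behind the table above.
# Factor and indicator names mirror the table; file name is hypothetical.
import pandas as pd
import semopy

THREE_FACTOR = """
Performance =~ t03_m01_p + t03_m02_p + t03_m05_p + t03_m07_p + t07_m06_p + t07_m02_p
Time =~ t03_m01_t + t03_m02_t + t03_m05_t + t03_m07_t + t07_m06_t + t07_m02_t
Action =~ t03_m01_a + t03_m02_a + t03_m05_a + t03_m07_a + t07_m06_a + t07_m02_a
"""

data = pd.read_csv("digital_literacy.csv")  # hypothetical file name
model = semopy.Model(THREE_FACTOR)
model.fit(data)

# std_est=True adds a column of standardized estimates, comparable
# to the standardized loadings reported in the table.
print(model.inspect(std_est=True))
```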
| Indicator | Dependencies | Standardized effect |
| --- | --- | --- |
| t03_m01 | t03_m01_p → t03_m01_t | 0.014 |
| | t03_m01_p ~ t03_m01_a | -0.1 |
| | t03_m01_a → t03_m01_t | 0.304* |
| t03_m02 | t03_m02_p → t03_m02_t | 0.044 |
| | t03_m02_p ~ t03_m02_a | 0.155* |
| | t03_m02_a → t03_m02_t | 0.194* |
| t03_m05 | t03_m05_p → t03_m05_t | -0.087* |
| | t03_m05_p ~ t03_m05_a | 0.54 |
| | t03_m05_a → t03_m05_t | 0.336* |
| t03_m07 | t03_m07_p → t03_m07_t | 0.096* |
| | t03_m07_p ~ t03_m07_a | -0.097 |
| | t03_m07_a → t03_m07_t | 0.22* |
| t07_m06 | t07_m06_p → t07_m06_t | -0.04 |
| | t07_m06_p ~ t07_m06_a | 0.04 |
| | t07_m06_a → t07_m06_t | 0.218* |
| t07_m02 | t07_m02_p → t07_m02_t | -0.221* |
| | t07_m02_p ~ t07_m02_a | -0.443* |
| | t07_m02_a → t07_m02_t | 0.509* |
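In the table, '→' denotes a regression path and '~' a covariance, so each task contributes two predictors of its time indicator plus a performance-action covariance. The sketch below extends the measurement model with these within-task dependencies for one task and runs a nested chi-square difference test, analogous to the model comparison reported in the abstract; as before, semopy and the file name are illustrative assumptions rather than the authors' actual setup.

```python
# Sketch: measurement model plus within-task dependencies, with a
# chi-square difference test against the model without them.
# Shown for one task (t03_m01); the remaining five tasks follow the
# same pattern. File name and package choice are assumptions.
import pandas as pd
import semopy
from scipy.stats import chi2

MEASUREMENT = """
Performance =~ t03_m01_p + t03_m02_p + t03_m05_p + t03_m07_p + t07_m06_p + t07_m02_p
Time =~ t03_m01_t + t03_m02_t + t03_m05_t + t03_m07_t + t07_m06_t + t07_m02_t
Action =~ t03_m01_a + t03_m02_a + t03_m05_a + t03_m07_a + t07_m06_a + t07_m02_a
"""

# In semopy/lavaan syntax, '~' is a regression (time regressed on
# performance and actions) and '~~' a covariance, matching the
# table's '→' and '~' respectively.
DEPENDENCIES = """
t03_m01_t ~ t03_m01_p + t03_m01_a
t03_m01_p ~~ t03_m01_a
"""

data = pd.read_csv("digital_literacy.csv")  # hypothetical file name

base = semopy.Model(MEASUREMENT)
base.fit(data)
extended = semopy.Model(MEASUREMENT + DEPENDENCIES)
extended.fit(data)

# Nested-model comparison: difference in chi-square evaluated against
# the difference in degrees of freedom.
s0 = semopy.calc_stats(base)
s1 = semopy.calc_stats(extended)
d_chi2 = float(s0["chi2"].iloc[0] - s1["chi2"].iloc[0])
d_dof = float(s0["DoF"].iloc[0] - s1["DoF"].iloc[0])
print("dChi2 =", d_chi2, "dDoF =", d_dof, "p =", chi2.sf(d_chi2, d_dof))
```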
Discussion
Conclusion
References
- Avdeeva S., Tarasova K. Ob otsenke tsifrovoi gramotnosti: metodologiya, kontseptual'naya model' i instrument izmereniya [Digital Literacy Assessment: Methodology, Conceptual Model and Measurement Tool]. Voprosy obrazovaniya = Educational Studies (Moscow), 2023, no. 2, pp. 8—32. DOI:10.17323/1814-9545-2023-2-8-32 (In Russ.).
- Zhang S., Wang Z., Qi J., Liu J., Ying Z. Accurate assessment via process data. Psychometrika, 2023. Vol. 88, no. 1, pp. 76—97. DOI:10.1007/s11336-022-09880-8
- Bartolomé J., Garaizar P., Bastida L. Validating item response processes in digital competence assessment through eye-tracking techniques. In Proceedings of the Eighth International Conference on Technological Ecosystems for Enhancing Multiculturality (TEEM 2020), 2020, pp. 738—746. DOI:10.1145/3434780.3436641
- Bergner Y., von Davier A.A. Process data in NAEP: Past, present, and future. Journal of Educational and Behavioral Statistics, 2019. Vol. 44, no. 6, pp. 706—732. DOI:10.3102/1076998618784700
- Hamari J., Shernoff D.J., Rowe E., Coller B., Asbell-Clarke J., Edwards T. Challenging games help students learn: An empirical study on engagement, flow and immersion in game-based learning. Computers in human behavior, 2016. Vol. 54, pp. 170—179. DOI:10.1016/j.chb.2015.07.045
- Cui Y., Chen F., Lutsyk A., Leighton J.P., Cutumisu M. Data literacy assessments: A systematic literature review. Assessment in Education: Principles, Policy & Practice, 2023. Vol. 30, no. 1, pp. 76—96. DOI:10.1080/0969594X.2023.2182737
- De Boeck P., Scalise K. Collaborative problem solving: Processing actions, time, and performance. Frontiers in psychology, 2019. Vol. 10, article ID 1280, 9 p. DOI:10.3389/fpsyg.2019.01280
- Mislevy R.J., Behrens J.T., Dicerbo K.E., Levy R. Design and discovery in educational assessment: Evidence-centered design, psychometrics, and educational data mining. Journal of educational data mining, 2012. Vol. 4, no. 1, pp. 11—48. DOI:10.5281/zenodo.3554641
- Li J., Bai J., Zhu S., Yang H.H. Game-Based Assessment of Students’ Digital Literacy Using Evidence-Centered Game Design. Electronics, 2024. Vol. 13, no. 2, article ID 385, 19 p. DOI:10.3390/electronics13020385
- Hu L., Bentler P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural equation modeling: a multidisciplinary journal, 1999. Vol. 6, no. 1, pp. 1—55. DOI:10.1080/10705519909540118
- Nichols S.L., Dawson H.S. Assessment as a context for student engagement. In Christenson S.L., Reschly A.L., Wylie C. (eds.), Handbook of research on student engagement. Boston: Springer Science+Business Media, 2012, pp. 457—477. DOI:10.1007/978-1-4614-2018-7_22
- Oliveri M.E., Mislevy R.J. Introduction to “Challenges and opportunities in the design of ‘next-generation assessments of 21st century skills’” special issue. International Journal of Testing, 2019. Vol. 19, no. 2, pp. 97—102. DOI:10.1080/15305058.2019.1608551
- Peng D., Yu Z. A literature review of digital literacy over two decades. Education Research International, 2022. Vol. 2022, article ID 2533413, 8 p. DOI:10.1155/2022/2533413
- OECD. Recommendation of the Council on Children in the Digital Environment [Electronic resource]. Paris: OECD, 2022. 14 p. URL: https://legalinstruments.oecd.org/public/doc/272/272.en.pdf (Accessed 26.02.2024).
- Laanpere M., UNESCO, UNESCO Institute for Statistics. Recommendations on assessment tools for monitoring digital literacy within UNESCO's Digital Literacy Global Framework. Montreal: UNESCO Institute for Statistics, 2019. 23 p. DOI:10.15220/2019-56-en
- Domingue B.W., Kanopka K., Stenhaug B. et al. Speed—Accuracy Trade-Off? Not So Fast: Marginal Changes in Speed Have Inconsistent Relationships with Accuracy in Real-World Settings. Journal of Educational and Behavioral Statistics, 2022. Vol. 47, no. 5, pp. 576—602. DOI:10.3102/10769986221099906
- Teig N., Scherer R., Kjærnsli M. Identifying patterns of students' performance on simulated inquiry tasks using PISA 2015 log‐file data. Journal of Research in Science Teaching, 2020. Vol. 57, no. 9, pp. 1400—1429. DOI:10.1002/tea.21657
- Heinonen J., Aro T., Ahonen T., Poikkeus A.-M. Test-taking behaviors in a neurocognitive assessment: Associations with school-age outcomes in a Finnish longitudinal follow-up. Psychological assessment, 2011. Vol. 23, no. 1, pp. 184—192. DOI:10.1037/a0021291
- Yu R., Li Q., Fischer C., Doroudi S., Xu D. Towards Accurate and Fair Prediction of College Success: Evaluating Different Sources of Student Data [Electronic resource]. In Rafferty A.N., Whitehill J., Romero C., Cavalli-Sforza V. (eds.), Proceedings of the 13th International Conference on Educational Data Mining, EDM 2020, Fully virtual conference (July 10—13, 2020). Montreal: International educational data mining society, 2020, pp. 292—301. URL: https://files.eric.ed.gov/fulltext/ED608066.pdf (Accessed 26.02.2024).
- Zumbo B.D., Hubley A.M. (eds.). Understanding and investigating response processes in validation research. Cham: Springer International Publishing, 2017. 383 p. DOI:10.1007/978-3-319-56129-5
- Ercikan K., Pellegrino J.W. (eds.). Validation of score meaning for the next generation of assessments: The use of response processes. N.Y.; London: Taylor & Francis, 2017. 165 p.
- Andrews-Todd J., Mislevy R.J., LaMar M., de Klerk S. Virtual performance-based assessments. In von Davier A.A., Mislevy R.J., Hao J. (eds.), Computational Psychometrics: New Methodologies for a New Generation of Digital Learning and Assessment: With Examples in R and Python. Berlin: Springer, 2021, pp. 45—60.
- Vuorikari R., Kluzer S., Punie Y. DigComp 2.2: The Digital Competence Framework for Citizens - With new examples of knowledge, skills and attitudes. Luxembourg: Publications Office of the European Union, 2022. 133 p. DOI:10.2760/115376
- Wang J., Wang X. Structural equation modeling: Applications using Mplus. New Jersey: John Wiley & Sons, 2019. 536 p. DOI:10.1002/9781119422730
- Wirth J. Computer-based tests: Alternatives for test and item design. In Hartig J., Klieme E., Leutner D. (eds.), Assessment of competencies in educational contexts. Göttingen: Hogrefe & Huber Publishers, 2008, pp. 235—252.
- Yamamoto K., Lennon M.L. Understanding and detecting data fabrication in large-scale assessments. Quality Assurance in Education, 2018. Vol. 26, no. 2, pp. 196—212. DOI:10.1108/QAE-07-2017-0038
- Zumbo B.D., Maddox B., Care N.M. Process and product in computer-based assessments: Clearing the ground for a holistic validity framework. European Journal of Psychological Assessment, 2023. Vol. 39, no. 4, pp. 252—262. DOI:10.1027/1015-5759/a000748
Information About the Authors