Interactive Assessments of CT (IACT): Digital Interactive Logic Puzzles to Assess Computational Thinking in Grades 3–8

https://doi.org/10.21585/ijcses.v5i1.149

Authors

  • Elizabeth Rowe, TERC
  • Jodi Asbell-Clarke
  • Mia Almeda
  • Santiago Gasca
  • Teon Edwards
  • Erin Bardar
  • Valerie Shute
  • Matthew Ventura

Keywords:

computational thinking, assessment, game-based learning, neurodiversity

Abstract

Background and Context: IACT, an inclusive assessment of computational thinking (CT) designed for accessibility and learner variability, was studied in more than 50 classes in US schools (grades 3–8).

Objective: Validation studies of IACT sampled thousands of students to establish its construct validity, concurrent validity, and test-retest reliability.

Method: IACT items for each CT practice were correlated with one another to examine construct validity. CT pre-measures were correlated with CT post-measures to examine test-retest reliability. CT post-measures were correlated with external measures to examine concurrent validity.
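As a rough illustration of these correlation analyses, the Python sketch below computes each statistic on synthetic data. Every name and value in it (the score arrays, the sample size) is a hypothetical assumption for illustration; this is not the study's actual data or analysis code.

    # Minimal sketch of the three correlation analyses described above,
    # run on synthetic (hypothetical) data.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n_students = 200  # hypothetical sample size

    # Construct validity: items targeting the same CT practice should
    # correlate with one another. Two correlated 0/1 item scores:
    item_a = rng.integers(0, 2, n_students)
    item_b = (item_a + rng.integers(0, 2, n_students)).clip(0, 1)
    r_construct, _ = pearsonr(item_a, item_b)

    # Test-retest reliability: pre-measure vs. post-measure totals.
    pre = rng.normal(50, 10, n_students)
    post = pre + rng.normal(0, 5, n_students)
    r_retest, _ = pearsonr(pre, post)

    # Concurrent validity: post-measure vs. an external measure.
    external = post + rng.normal(0, 8, n_students)
    r_concurrent, _ = pearsonr(post, external)

    print(f"construct r={r_construct:.2f}  "
          f"test-retest r={r_retest:.2f}  "
          f"concurrent r={r_concurrent:.2f}")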

Findings: IACT studies showed moderate evidence of test-retest reliability and concurrent validity, and low to moderate evidence of construct validity, for an aggregated measure of CT, but weaker validity and reliability evidence for individual CT practices. These findings were similar for students with and without Individualized Education Programs (IEPs) or 504 plans.

Implications: IACT is the first CT assessment tool for grades 3–8 to be validated in a large-scale study among students with and without IEPs or 504 plans. While improvements are needed to strengthen its validity, IACT is a promising start.

Published

2021-12-17

How to Cite

Rowe, E., Asbell-Clarke, J., Almeda, M. V., Gasca, S., Edwards, T., Bardar, E., Shute, V., & Ventura, M. (2021). Interactive Assessments of CT (IACT): Digital Interactive Logic Puzzles to Assess Computational Thinking in Grades 3–8. International Journal of Computer Science Education in Schools, 5(2), 28–73. https://doi.org/10.21585/ijcses.v5i1.149