Automated Detection of Students’ Gaze Interactions in Collaborative Learning Videos: A Novel Approach

  • Conference paper
  • In: Responsive and Sustainable Educational Futures (EC-TEL 2023)

Abstract

Gaze behaviours have been considered important social signals for exploring human learning. Over the past decade, research has shown positive relationships between certain features of gaze behaviours and the quality of collaborative learning. However, most studies detect students’ gaze behaviours with eye-tracking tools, which are costly, logistically challenging, and can be obtrusive in real-world physical collaboration spaces. This study presents a novel approach to detecting students’ gaze behaviours from videos of real-world collaborative learning activities. Pre-trained computer vision models were used to detect objects in the scene, students’ faces, and their gaze directions. A rule-based approach was then applied to detect gaze behaviours associated with the peer-communication and resource-management aspects of collaborative learning. To test the accuracy of the proposed approach, twenty collaborative learning sessions, each lasting between 33 and 67 min, from five groups in a 10-week higher education course were analysed. The results showed that the proposed approach achieves an overall accuracy of 66.57% in automatically detecting students’ gaze interactions in collaborative learning videos. The implications of these findings for supporting students’ collaborative learning in real-world technology-enhanced learning environments are discussed.
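The rule-based step described in the abstract lends itself to a compact illustration. Below is a minimal Python sketch of how such gaze-interaction labelling could work, assuming upstream pretrained detectors have already produced, for each video frame, a face bounding box per student and an estimated gaze target point in image coordinates. All names here (Detection, classify_gaze, the example coordinates) are hypothetical illustrations, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2) in pixels


@dataclass
class Detection:
    student_id: int
    face_box: Box                     # from a pretrained face detector
    gaze_point: Tuple[float, float]   # estimated gaze target in the image


def inside(point: Tuple[float, float], box: Box) -> bool:
    """True if an (x, y) point falls inside a bounding box."""
    x, y = point
    x1, y1, x2, y2 = box
    return x1 <= x <= x2 and y1 <= y <= y2


def classify_gaze(det: Detection, peers, resource_boxes) -> str:
    """Rule-based labelling of one student's gaze in one frame:
    'peer' if the gaze point lands on another student's face,
    'resource' if it lands on a detected learning resource
    (e.g. a laptop or worksheet), else 'other'."""
    for peer in peers:
        if peer.student_id != det.student_id and inside(det.gaze_point, peer.face_box):
            return "peer"
    for box in resource_boxes:
        if inside(det.gaze_point, box):
            return "resource"
    return "other"


# Example frame: two students and one shared laptop.
frame = [
    Detection(0, (100, 80, 180, 170), gaze_point=(430, 120)),
    Detection(1, (400, 90, 480, 180), gaze_point=(300, 320)),
]
laptops = [(250.0, 280.0, 360.0, 360.0)]
for det in frame:
    others = [d for d in frame if d.student_id != det.student_id]
    print(det.student_id, classify_gaze(det, others, laptops))
# -> student 0 looks at student 1 ('peer'); student 1 looks at the laptop ('resource')
```

In the study, frame-level labels of this kind were compared against manual annotations to arrive at the reported 66.57% overall accuracy; the specific rules, thresholds, and object classes the authors used are not reproduced here.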

Acknowledgement

This research was partially funded by the UCL-CDI AWS Doctoral Scholarship in Digital Innovation and by the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 101004676.

Author information

Correspondence to Qi Zhou.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Zhou, Q., Bhattacharya, A., Suraworachet, W., Nagahara, H., Cukurova, M. (2023). Automated Detection of Students’ Gaze Interactions in Collaborative Learning Videos: A Novel Approach. In: Viberg, O., Jivet, I., Muñoz-Merino, P., Perifanou, M., Papathoma, T. (eds) Responsive and Sustainable Educational Futures. EC-TEL 2023. Lecture Notes in Computer Science, vol 14200. Springer, Cham. https://doi.org/10.1007/978-3-031-42682-7_34

  • DOI: https://doi.org/10.1007/978-3-031-42682-7_34

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-42681-0

  • Online ISBN: 978-3-031-42682-7

  • eBook Packages: Computer Science, Computer Science (R0)
